mgeise
Members
Posts: 584

Everything posted by mgeise

  1. Make sure that the role has execute permissions on all of the resources used by the report. For the report to be visible, make sure that the role also has read access to all of the subdirectories. See if that works.
  2. Thanks...we are putting additional tools in place. We have been busy getting our release posted, etc. and will have this issue under control very soon.
  3. What is the error you are getting? Does the field referenced within it exist and is it populated by your main query?
  4. When you go through the Amazon Marketplace page, you should be given a set of instance sizes to choose from. You can see the list and associated pricing on this page: https://aws.amazon.com/marketplace/pp/B00B527JQ0/ref=jaspersoft_ptnr_aws_web Additional information on getting started can be found here: http://community.jaspersoft.com/jaspersoft-aws
  5. If the values are stored in the database already, you might consider using a query-based input control (a small sketch of such a query appears after this list). Aside from that, you would likely be looking at increasing the size of some DB columns and trying to modify the form validation.
  6. Assuming that it is not just a giant report and it sometimes returns quickly and sometimes slowly: when it happens, you may want to check the memory utilization on your server. It is likely that you are running low on memory. Systems can slow down when they start using swap space. This could be caused by other people running a lot of concurrent reports, combined with reports running on a schedule, etc. Running your server with plenty of memory and tuning your garbage collection, etc. (some of the recommended parameters are found in the installation guide) can improve the performance of your reports quite a bit.
  7. Absolutely...going from the trial of the commercial edition to a fully licensed version is a simple matter of putting a license key in place.
  8. 5.5.0a is simply a repackaging of 5.5.0. If you are already running 5.5.0, there is no need to change...however, if you are doing a new install, 5.5.0a is the current packaging of the 5.5.0 release.
  9. Your issues are very likely caused by a conflict in Glassfish 4. Glassfish is not certified for the community edition, and only versions 2.1, 3.0, and 3.1.2 are certified for the commercial edition. http://www.jaspersoft.com/sites/default/files/Jaspersoft%20Platform%20Support%20V5.5.pdf
  10. It is likely that your DB is not accepting connections from the AWS instance. If your DB is also on AWS you will need to make sure that it is in the same security group, that the security group has the appropriate permissions on the port you are trying to connect to, etc. I hope this helps
  11. Is it possible that you are looking at the wrong query? It seems to be complaining about the query for the "Comparison Graph." Potentially this is a query that is generated from a subreport, or from a chart that you added separately from the main report query. You might try turning up the logging on the server, or even try capturing the query in the DB, to see what the actual query is. It is definitely a problem with the query that is being sent to the DB.
Poking around the internet a bit, I found that this has been seen before, and one response was: "I got this error when I was using PostgreSQL to query an Amazon Redshift database. To fix it, I had to change the RANK function to this Redshift-specific function: SUM(1) OVER (PARTITION BY field1 ORDER BY field2 ROWS UNBOUNDED PRECEDING)"
Another item that I found that shows why this might happen is this: "In the result, the database returns four records. If we want to get only one entry for every name, we can use DISTINCT: SELECT DISTINCT name FROM customer INTERSECT SELECT DISTINCT name FROM sales; In the example, we have to use DISTINCT in both SELECT statements; otherwise, an error is displayed: SELECT DISTINCT name FROM customer INTERSECT SELECT name FROM sales; ERROR: get_sortgroupclause_tle: ORDER/GROUP BY expression not found in targetlist"
  12. Sorry about that. Our engineers have been very busy finalizing our next release...when that happens, their ability to focus on answering questions for our community, etc. tapers off a bit.
  13. Our products, services, support, and commitment to our community and to commercial open source, subscription software will remain unchanged. We will be a product group within TIBCO, operating as the same team with the same focus on our customers/community and their needs.
  14. JasperReports Server does not actually ship with OpenSSL. It would be a library on the actual server, shipped within the operating system (not within JasperReports Server). If you have OpenSSL installed on your server, you should be able to do a simple update to it to ensure that you are not vulnerable. The following has some information on how to run the update on various operating systems: https://www.digitalocean.com/community/articles/how-to-protect-your-server-against-the-heartbleed-openssl-vulnerability
If your concern is not about the product, but instead about our websites (jaspersoft.com, community.jaspersoft.com, etc.), we updated our OpenSSL version very quickly, within the first day of Heartbleed being announced. We do not believe that any user information has been compromised; however, we recommend that you change your passwords, just as most sites have recommended in response to this issue. Regularly changing your passwords is always a good practice to improve security.
  15. Jaspersoft Studio Tutorial - How to create a new connection between Jaspersoft Studio 5.5.0 and a MySQL database, and create a simple report. How to add a report file to a NetBeans project:
  16. Jaspersoft Studio is the reporting tool that runs on the Eclipse platform, rather than the NetBeans platform that the previous tool, iReport, was built on. Jaspersoft has announced that there will be no more updates to the standalone version of iReport, so Jaspersoft Studio will be the future reporting tool for open source developers instead of iReport. More tutorials at http://www.thainetbeans.com/report
  17. Although the content of your subreports is small enough to fit on the page, it is likely that the size of the subreport, plus the repeating/stretching, is going beyond the size of the master report page. You might try making the actual subreport smaller, or making the band in the master report where you are putting the subreport smaller and stretchable with the content.
  18. Hi Jennifer, What is the resolution, format, and size of the source image that you are using? And the size of the area within the report where you are putting it? Many times this is just a situation where the source image quality is good enough for displaying on a screen, but the resolution is too low for print, or the image is being stretched to a larger size in the report, which lowers the effective resolution. Thanks! Matt
  19. Big Value. Broad Usefulness. The time for putting all data to work is now. The cost of collecting and using all data is plummeting. Meanwhile, the tools to access and work with data are becoming simpler. New, big, multi-structured data sets, combined with the significant architectural advances of cloud computing, have created advantages accessible to any organization. Most profoundly, the next few years will usher in a new class of application, one that uses all available data and is built on the superior economics of cloud-based computing. This class of applications will help make big data small and will be purpose-built to solve an enormous array of real business problems. This is what's next for Big Data.
Cost reduction and improving economics are key ingredients to pervasive use of data. Open source projects are fueling the rise in usefulness of Big Data and creating disruption on a grand scale. Apache Hadoop and Cassandra are some of the best known examples, but they represent just the tip of the iceberg. A recent IDG survey showed that 70% of enterprises already have or are planning to implement a Big Data project, and about 30% of respondents reported using an open source big data framework, such as Hadoop. Our own Big Data survey described a doubling of Big Data projects (versus our same survey last year) as well as a marked reduction (47% decrease) in the confusion about the definition and usefulness of Big Data.
I've written previously about the amazing advancements in sensor technologies, storage resources and compute cycles, which create a perfect, low-cost foundation for developing and delivering new applications that harness huge data sets to new effect. Two of my favorite examples of this new generation of applications in action are Clockwork and TrackX (formerly Fluensee). Clockwork has created a modern, cloud-based Enterprise Asset Management system that is transforming the management of capital-intensive assets for any size of organization. Clockwork recently reported that it has saved its customers more than $3 billion in the past year by combining sensors and sophisticated software to fundamentally improve the maintenance, repair and operations of their assets. TrackX creates software and technology to effectively track and manage physical assets using RFID, barcode, GPS and sensor technologies with its rapidly-deployed, cloud-based software. A key ingredient in TrackX solutions is analytics, because its mission is to provide insight on all asset-related data. TrackX takes all asset attributes and measurements and creates an analytical picture that provides customers with actionable intelligence to better optimize time, labor and asset productivity. This insight is cleverly designed into its application software, helping anyone in the organization become more analytically astute.
So, what's next for Big Data? The two large growth areas for analytics are: 1. making data scientists far more capable by giving them more useful and cost-effective tools than ever before, including Jaspersoft's recent innovations enabling rapid blending of big and small data through data virtualization; 2. making everyone else in the organization more capably analytical by providing them with just the right amount of analytics at the right place and the right time (no more, no less), within the applications and processes they use every day. As a result, they can make better data-driven decisions than ever before.
The second growth area above will have the broadest, most transformational effect on an organization because it can involve everyone, regardless of level or title, making decisions on a daily basis. IDG’s recent survey found that 42% of the respondents plan to invest in these applications in 2014, making this one of the top five areas for investment. This investment level should provide us with great hope, because, although we’re in the very first phase of a revolution in data and analytics, the evidence is now clear. Dramatically lower costs for capturing and using all the data, combined with a new breed of cloud-based analytical applications, will power huge gains in value and usefulness for data in any organization.
  20. Four Trends That Will Shape 2014
From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics. Think of this as my travel log and diary distilled into a short, easily readable series of blog posts. I invite you to follow this short series through my first, second and third installments and now my fourth (and final) below. Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's product priorities as well. I look forward to the on-going dialog, and I thank our customers and partners for their business and partnership, which mean everything to us.
Putting more data to work drives innovation. Innovation can transform processes, products, services and people. Our newfound ability to cost-effectively analyze and find hidden patterns in huge swaths of data will enable a new generation of business-led technology innovation. With this trend, the IT organization must find new ways to integrate and collaborate within the enterprise, becoming an enabler of business-led innovation. This collaboration is more important than ever, as technology now defines the new economic battleground for every industry and organization. Even Gartner's latest predictions abound with a Digital Industrial Revolution theme and watchwords for CIOs and their IT organizations to either lead or get out of the way. It's a bold new world.
All companies are now technology companies. Every organization must put technology to work in ways that create distinction and competitive advantage. Evidence of this trend can be found in any industry-leading company today, where IDC says that business units already control 61% of tech spending. Fortunately, the technological barriers to entry have never been lower. Organizations of all sizes now have affordable access to powerful enterprise tools, which levels the playing field, allowing even small companies to compete with the big guys (sometimes even more effectively, because of their nimbleness). One example is AirIT, which can help every airport become a technology-enabled data center, driven by metrics relevant to the business, which, in turn, streamline operations and save money.
Leading enterprises will overtly staff, skill and organize to maximize innovative uses of technology, creating a cascade that will impact education, training and personnel management in all corners of the organization. Even military organizations realize that gaining skill and expertise in data and analytics will remain at the forefront for personal advancement. The risk for all is that a concentration of even deeper technology skills will create digital haves and have-nots within industries, creating a difficult spiral for laggards.
Lastly, in order for business-led tech innovation to really flourish, many more knowledge workers (than today) must have access to just the right amount of data and analysis, at the right place and the right time (not too much, not too little), which promises to make everyone into a more capable analyst and decision maker (regardless of job level, title and even skill). In 2014, analytics becomes a thing that you do, not a place that you go, and the need for intelligence inside of the applications and business processes we use every day becomes an agent of business-led tech innovation.
  21. Author: 小沢仁. Happy New Year! This year we again plan to keep improving JasperReports to make it easier to use. First, we will make the obvious things work the obvious way. We will look at how to build reporting/BI systems that handle big data with response times under 100 ms. No matter how feature-rich a system is, users will not actually use a feature if a single operation takes more than 20 seconds; I also think it is a problem when producing a report takes longer than I spend sleeping. Fortunately, JasperReports can handle big data quickly; to get there, the software design and the system architecture are what matter.
Rather than "how to build something that merely runs", this site aims to collect the know-how for building systems that users can actually use. Handling big data and delivering report/BI analysis within a one-second response time requires different techniques than before. From the system architecture, to how reports are built, to the data sources, every aspect calls for a different way of thinking than conventional report/BI tools. This site will gather information on this kind of more advanced, production-grade reporting/BI, focusing mainly on the big-data-related Apache projects. If your design depends on caching: if (data > memory) then {you = history} ("If the data volume exceeds physical memory, the system designer becomes a person who is no longer needed.")
Basic approach: big data can be processed with JasperReports. The issue is how JasperReports is used.
System architecture: (1) Do not use an RDB. The database is often the performance bottleneck. An RDB performs better when the data is small, but it is not sufficient for processing big data. (2) Do not share big data between application servers. When handling big data, clustering the app servers and enabling shared memory makes the overhead of sharing the big-data memory too high. (3) Design the reporting and BI systems separately. (4) Use OSS. Handling big data sometimes means scaling out, and with commercial products that means buying new licenses.
How to build reports: do not use CASE, JOIN, or UNION in the SQL statements that fetch report data from the database; CASE, JOIN, and UNION are slower than a simple SELECT (a small illustration appears after this list). Adding master-table columns to every row increases the data volume and puts load on the database and the network. More columns also means the app server uses more memory, so as the number of concurrent users grows you run out of memory and performance degrades.
  22. From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics. Think of this as my travel log and diary distilled into a short, easily readable series of blog posts. I invite you to follow this short series through my first and second installments and now my third below. Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's product priorities as well. I look forward to the on-going dialog, and I thank our customers and partners for their business and partnership, which mean everything to us.
Trend #3: From Schema-Constrained to Idea-Constrained, The Real Big Data Opportunity. In the past (and too often today), we collected just the data that we could afford to store and for which we had a clear, known use. In this sense, we were hard-wired to winnow the data down to its most obvious and practical subset; thus, we were (and are) schema-constrained. By this I mean that today we must know, in advance, the uses for data as it is being captured. This mindset leaves little-to-no room for future, latent value that may exist within a data set. In physics, we recognize that energy has an immediate value (kinetic) and a future value (latent). Why should data be any different?
As costs have declined and the power of technology has increased exponentially, we now have the ability to store and use ALL of the data, not just some of the data. But we may not always know the value of this data while it is being captured. That's okay. The latent value of data will become more obvious each year, and the technology now exists for this to be the norm. In this sense, the real big data opportunity is based on the scale of our ideas to put data to work, finding new correlations and value where they previously were not discernible. Unlocking this new value becomes easier as the world is increasingly digitized; that is, we now regularly put very new data types to work: geo-position / location, sensor updates, click streams, videos and pictures, documents and forms, etc. Just a few years ago, almost none of this would have been considered "data". Commonly using all these new data types and searching for correlations that can positively impact business will shift the primary constraint to the quality and quantity of our ideas.
Perhaps my favorite example of latent data use is the world's first consumer light field camera from Lytro. The Lytro camera captures and processes the entire light field (11 million rays of light, to be precise). This allows every shot to be filtered and viewed from any perspective, after the shot, allowing the user to later uncover what the best focus, angle, and zoom might yield. Lytro refers to the result as a "living picture" with a 3-dimensional feel. What a beautiful use of big, latent data. In 2014, we'll move from being schema-constrained to idea-constrained, more often finding the real value in Big Data.
  23. In this tech talk we will demonstrate the process for upgrading JasperReports Server. This will include a demonstration using both the WAR file installer and the overlay installer packages. For more Jaspersoft Tech Talks, see: http://www.jaspersoft.com/tech-talks
  24. From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics. Think of this as my travel log and diary distilled into a short, easily readable series of blog posts. I invite you to follow this short series with my first installment here and my second below. Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's business intelligence product priorities as well. I look forward to the on-going dialog, and I thank our customers and partners for their business and partnership, which mean everything to us.
As cloud-originated data growth accelerates, the attractiveness of using additional cloud-based data management and analysis services will grow too. I've previously written about an entirely new generation of platform and middleware software that will emerge to satisfy the new set of cloud-based, elastic compute needs within organizations of all sizes. Commonly at this software layer, utility-based pricing and scalable provisioning models will be chosen to more strictly match consumption patterns with fees paid. These improved economics, in turn, will enable broader use of platform and middleware software, especially reporting and analytics, than ever before possible. Additionally, cloud-based delivery portends a level of simplicity and ease-of-use (consumer-like services) that defies the earlier generation of enterprise software, ushering in deeper consumption of analytics by organizations of all sizes. In short, cloud-based delivery becomes a key component of the quest for pervasive analytics, especially when those analytics are delivered as components within the web applications we use every day.
According to Nucleus Research: "As companies learn to take full advantage of the analytics functionalities that are now available with utility and subscription-based pricing options, they will continue to become more able to take advantage of market trends and opportunities before their peers and take advantage of the average return of $10.66 for every dollar spent in analytics." In 2014, cloud-based delivery will change the analytics consumption pattern.
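Sketch for item 5 above: a minimal illustration of the kind of query a query-based input control could run when the allowed values already live in the database. The customers table and country column are hypothetical stand-ins for whatever your data actually contains; the value the user picks would then be passed to the report as a parameter.

    -- Hypothetical lookup query for a query-based input control:
    -- return each distinct value once, sorted for display in the dropdown.
    SELECT DISTINCT country
    FROM customers
    ORDER BY country;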
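Sketch for item 21 above: a rough illustration of the report-query advice in that post, using hypothetical orders and customers tables. The first statement is the pattern the post discourages (joining in master-table columns repeats them on every row, adding database, network, and app-server load); the second is the kind of plain SELECT it recommends, with lookups resolved elsewhere.

    -- Discouraged for big-data report queries: the JOIN copies master columns onto every row.
    SELECT o.order_id, o.amount, c.customer_name, c.region
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id;

    -- Shape recommended by the post: a simple SELECT against the fact data only.
    SELECT order_id, customer_id, amount
    FROM orders;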