Everything posted by mgeise

  1. In this Jaspersoft Tech Talk episode, we cover some best practices from web and user interface design and how they apply to report design. You will learn how to use (or not use) design elements such as:

     * Borders, backgrounds, shadows, 3-D styles
     * Space
     * Typography: fonts, size, weight, and style
     * Colors
     * Alignment

     To view our full list of Jaspersoft Tech Talks, please visit http://www.jaspersoft.com/tech-talks
  3. Sponsors

     Jaspersoft would like to acknowledge the following sponsors of our open source projects. Their contributions to our open source projects and community ecosystem enable us to better provide our open source BI options to our developer community:

     * JFrog: JFrog, through Artifactory, hosts open source projects and provides their users and committers with a binary repository manager in the cloud, as a platform for publishing and delivering their release artifacts. See the Jaspersoft binary repository at http://jaspersoft.artifactoryonline.com.
     * SourceForge: SourceForge is dedicated to making open source projects successful. SourceForge thrives on community collaboration to help create the leading resource for open source software development and distribution. With the tools SourceForge provides, 3.4 million developers create powerful software in over 324,000 projects. Their popular directory connects more than 46 million consumers with these open source projects and serves more than 4,000,000 downloads a day. SourceForge is where open source happens.
  4. As your application matures, it becomes increasingly interesting to send periodic reports or to conduct ad hoc analysis on the data. While the choice of reporting and analytics tools for relational databases is large, the choice of tools for MongoDB and other NoSQL data sources is quite limited. This session covers some example reporting and analytics using the aggregation framework and explores three distinct analysis options (direct, indirect, and batch). You'll walk away with a deeper understanding of software that you can use standalone or embedded into other applications, allowing you to turn your MongoDB data into valuable and actionable information. To see the Q&A details associated with this session, please visit: http://www.jaspersoft.com/tech-talk-mondodb-reporting-and-analytics For past and future Jaspersoft Tech Talks, check out the schedule and archive at: http://www.jaspersoft.com/tech-talks
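     The aggregation framework mentioned above groups and summarizes documents inside the database. As a rough illustration of the idea (the collection shape, field names, and figures here are invented for the example, not taken from the session), this pure-Python sketch computes what a MongoDB `$group` stage would return for a sum-and-count pipeline:

     ```python
     from collections import defaultdict

     # Hypothetical order documents, as they might sit in a MongoDB collection.
     orders = [
         {"region": "EMEA", "amount": 120.0},
         {"region": "EMEA", "amount": 80.0},
         {"region": "APAC", "amount": 200.0},
     ]

     # Rough equivalent of:
     #   db.orders.aggregate([{"$group": {"_id": "$region",
     #       "total": {"$sum": "$amount"}, "count": {"$sum": 1}}}])
     def group_totals(docs):
         totals = defaultdict(lambda: {"total": 0.0, "count": 0})
         for doc in docs:
             bucket = totals[doc["region"]]
             bucket["total"] += doc["amount"]
             bucket["count"] += 1
         return dict(totals)

     print(group_totals(orders))
     # {'EMEA': {'total': 200.0, 'count': 2}, 'APAC': {'total': 200.0, 'count': 1}}
     ```

     The "direct" analysis option discussed in the session would push a pipeline like the commented one down to MongoDB itself rather than looping in the client.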
  6. Jaspersoft ships the JasperReports Server product in two ways: a "bundled" installer with Apache Tomcat and Postgres, and an unbundled WAR file where you are responsible for providing your own application server and DBMS. This episode focuses on the "unbundled" version and guides you through a typical Linux installation. Here's what we went through:

     * Planning your installation: how to decipher the Platform Support Datasheet, including versions of JVM, app server, and DBMS
     * Using the buildomatic scripts to deploy Jaspersoft
     * JVM memory adjustments
     * Installation best practices, including backups, locations, etc.

     To see the Q&A details associated with this session, please visit: http://www.jaspersoft.com/tech-talk-linux For past and future Jaspersoft Tech Talks, check out the schedule and archive at: http://www.jaspersoft.com/tech-talks
  8. The server will stop working as soon as the license expires. I recommend working with your account manager to get a new license key.
  9. Unfortunately the answer is no. The HTML 5 Charting is not included in the community edition.
  10. A tutorial on using subreports in Jaspersoft's iReport Designer; example runs several queries each mapping to a different subreport
  11. A tutorial on using templates in Jaspersoft's iReport Designer; example creates a report using the wizard and a template created in the video
  12. A tutorial showing how to add grand totals to a report created with Jasper's iReport Designer
  13. Last month, Jaspersoft announced the industry’s first completely pay-as-you-go reporting and analytics service on Amazon’s AWS Marketplace. With this service, you can literally be up and running (analyzing your data) in less than 10 minutes and pay as little as 52 cents per hour to do so. And, as we’ve just announced, Amazon and Jaspersoft added more than 100 customers during the first month of availability – a great start to a new service destined to change the way BI is consumed for many purposes.

      One of my favorite university professors recently asked me what worries me the most about being on the cutting edge with Amazon and this new service. My response: NOT being on the cutting edge with Amazon and this new service. In other words, I would worry most about not innovating in this way. Disrupting through both our business model and product innovation is a critical part of our culture at Jaspersoft. In fact, the early success of our new Amazon-hosted service reminded me of two fast-emerging, interrelated cloud computing concepts that, though not discussed sufficiently, will have a substantial impact on the future usage and adoption of cloud-based computing services. These two concepts are cloud-originated data and the post-transactional cloud [1]. I maintain that, as the former quickly grows, the latter becomes commonplace.

      Cloud-Originated Data

      While the total digital universe currently weighs in at nearly 3 zettabytes, it is estimated that more than one exabyte of that data is stored in the cloud. Each day, the growth rate of cloud-originated data increases because of the explosion in services and applications that rely on the cloud as infrastructure. So, a disproportionate amount of the 13x growth projected in the digital universe between now and 2020 will come from cloud-originated data. IDC estimates that by 2020, nearly 40% of all information will be “touched” by cloud computing (somewhere from origination to disposal). Eventually, most of the digital universe will be cloud-based.

      The growth in Amazon’s Simple Storage Service (S3) provides another compelling data point for the growth of cloud-originated data. In the past several years, Amazon’s S3 service has seen meteoric growth, now storing nearly one trillion objects (growing by 1 billion objects per day) and handling more than 650,000 requests per second (for those objects). The chart at http://4.bp.blogspot.com/-SA4EeyMbhoo/UWNlkwOOqwI/AAAAAAAAAHI/B7wcYwDKpoM/s320/Amazon+S3_Growth_2012_Q1.png illustrates this dramatic growth [2].

      Importantly, cloud-originated data is more easily liberated (post-transaction) by other cloud services, which can unlock additional value easily and affordably. According to a recent report by Nucleus Research, companies that more quickly utilize cloud-based analytics are likely to gain a competitive advantage: “As companies learn to take full advantage of the analytics functionalities that are now available with utility and subscription-based pricing options, they will continue to become more able to take advantage of market trends and opportunities before their peers and take advantage of the average return of $10.66 for every dollar spent in analytics.” Ultimately, analytics is just one of many important post-transactional uses of cloud-based data, which will surely be the subject of future posts.

      Post-Transactional Cloud

      My working definition of the post-transactional cloud is “the next generation of cloud services, beyond Software-as-a-Service (SaaS), designed to enable platform and middleware tools to use cloud-originated transactional data and deliver a richer, more sophisticated computing experience.” The concept of a post-transactional cloud provides a powerful analog that mirrors the history of the on-premises computing world. Let me elaborate.

      The ERP/CRM/supply chain application boom of the ’80s and ’90s preceded an enormous need in the ’90s and ’00s for additional tools and software systems designed specifically to create even greater value from the data generated by these (on-premises) transactional applications. Then, tools for data management, data integration, data warehousing, and business intelligence (reporting and analytics) were born to deliver this new value.

      Cloud computing has grown substantially in the last 10 years largely because of applications hosted in the cloud and made available as a service directly to consumers and businesses. The poster child here is Salesforce.com (although there are thousands of others). Note that we call this category “Software-as-a-Service” when it really should be called “Application-as-a-Service,” because the providers in this category are delivering a transactional, process-oriented application designed to automate and improve some functional aspect of an organization. As the use of these managed services/applications grows, so too does the quotient of cloud-originated data generated by these applications.

      The dramatic rise in cloud-originated data from SaaS applications portends a similar need: this one for post-transactional, cloud-based tools and software systems to define a new usage curve for liberating cloud-based data and creating substantially new organizational value. It’s just a matter of time. Which makes Jaspersoft’s work with Amazon clear and understandable. In fact, Jaspersoft’s cloud-based service (across all major Platform-as-a-Service environments, such as VMware’s Cloud Foundry and Red Hat’s OpenShift, but right now, especially with Amazon’s AWS) helps ensure our tools are the de facto standard for reporting and analysis on cloud-originated data (in the post-transactional cloud). We’ll do this in two ways:

      1. By bringing our BI service to customers who already prefer to use cloud services, and by being available in their preferred cloud instead of forcing them into our cloud; and
      2. By enabling elegant, affordable, embeddable reporting and analysis within cloud-based applications, so those who deliver this software can include intelligence inside their transactional applications.

      At Jaspersoft, we ultimately see our cloud-based service as vital to reaching the broadest possible audience with just the right amount of reporting and analytics (not too much, not too little). The post-transactional cloud will be fueled by cloud-originated data, and the need to deliver cleverly designed intelligence inside this environment will be more important than ever.

      Brian Gentile, CEO, Jaspersoft

      [1] I’ve borrowed the term “post-transactional cloud” from ZDNet’s Andrew Brust, in his article entitled “Amazon Announces ‘Redshift’ cloud data warehouse, with Jaspersoft support”.
      [2] Data and chart excerpted from the TechCrunch article “Amazon S3: 905 Billion Objects Stored, 1 Billion Added Each Day”, Sarah Perez, April 6, 2012.
  15. So you've purchased Jaspersoft and you want to allow your end users to create analytical views through one of our metadata layers: Domains or OLAP. Which one is right for you? Get the complete picture on our two technologies. Find out which one is right for you by looking at these dimensions:

      * Volume: size of data
      * Complexity of queries and calculations
      * Row-level security requirements
      * Velocity: how quickly do users expect data to be returned?
      * Velocity: how quickly does the data change underneath?
      * Variety: where does the data come from?

      To see the Q&A details associated with this session, please visit: http://www.jaspersoft.com/tech-talk-an-analysis-of-analysis-a-domain-and-olap-shootout For past and future Jaspersoft Tech Talks, check out the schedule and archive at: http://www.jaspersoft.com/tech-talks
  17. This is a function of how you send the PDF to the printer rather than how the PDF itself is generated. JasperReports does not provide a way to manage this; it is handled by your print driver, etc. I hope this helps.
  18. Jaspersoft ETL comes bundled with many Jaspersoft editions. Are you making use of your investment, or are you still writing and managing your ETL scripts by hand? In this session you'll learn:

      * ETL architecture overview: designer, administration, job execution
      * Installing custom components
      * What's coming in new versions
      * Q&A session

      To see the Q&A details associated with this session, please visit: http://www.jaspersoft.com/tech-talk-jaspersoftetl-architecture For past and future Jaspersoft Tech Talks, check out the schedule and archive at: http://www.jaspersoft.com/tech-talks
  20. Learn about integrating "the intelligence inside" of your corporate or customer portal using Jaspersoft. The focus will be on Java portal technologies, but the concepts can be applied to other technologies as well. We'll cover the following topics:

      * Jaspersoft integration options with portals
      * Liferay portlet integration, including a configuration example
      * iFrame integration, including an example
      * JSR168 portlets (into other portals)
      * Single Sign-On with CAS

      To see the Q&A details associated with this session, please visit: http://www.jaspersoft.com/tech-talk-integrating-jaspersoft-with-portals For past and future Jaspersoft Tech Talks, check out the schedule and archive at: http://www.jaspersoft.com/tech-talks
  22. Dashboards are among the most frequently requested BI deliverables because they're easy to understand and can drive the performance of an organization. This session provides some useful tips and tricks for building the most effective dashboards. To see the Q&A details associated with this session, please visit: http://www.jaspersoft.com/dashboard-design-tips-and-tricks The slides can be accessed here: http://www.slideshare.net/jaspersoft/dashboard-tipsandtricks For past and future Jaspersoft Tech Talks, check out the schedule and archive at: http://www.jaspersoft.com/tech-talks
  24. DB2 as a data source will not likely have any issues in the community edition. We don't test it, but there are plenty of community members who have their data in DB2 and are able to connect and generate reports from it. At the end of the day, if you can connect through a JDBC driver and it is SQL-92/JDBC 2.1 compliant, there should not be an issue.
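      The compatibility point above boils down to: can a standard driver connect and run plain SQL-92? As a loose analogy (using Python's built-in sqlite3 in place of a DB2 JDBC driver, with an invented table and sample rows), the kind of smoke test a report engine effectively performs looks like this:

      ```python
      import sqlite3

      def smoke_test(conn):
          """Run the kind of plain SQL-92 query a report engine issues."""
          cur = conn.execute(
              "SELECT region, SUM(amount) AS total "
              "FROM sales GROUP BY region ORDER BY region"
          )
          return cur.fetchall()

      # Stand-in database; a real setup would connect through the DB2 JDBC driver.
      conn = sqlite3.connect(":memory:")
      conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
      conn.executemany("INSERT INTO sales VALUES (?, ?)",
                       [("EMEA", 100.0), ("EMEA", 50.0), ("APAC", 75.0)])
      print(smoke_test(conn))  # [('APAC', 75.0), ('EMEA', 150.0)]
      ```

      If queries like this run cleanly through the driver, report generation generally works regardless of whether the backend is officially tested.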