December 13, 2010
But, now the time has finally come. We’ve reached the point where adding an annual event to these local meet-ups is required. Our thousands of commercial customers (across more than 100 countries), nearly 1,000 software subscription customers, and more than 180,000 community members require and deserve the next step: JasperWorld.
I’m thrilled to announce that Howard Dresner will provide a keynote address on February 8 and Marten Mickos will do so on February 9. Howard is the well-known business intelligence analyst and industry expert who spent so many years at Gartner, then did an executive stint at Hyperion (until it was acquired by Oracle) and now is the principal of Dresner Advisory Services. Marten is the former CEO of MySQL, the current CEO of Eucalyptus Systems and has been a good friend to Jaspersoft for some time. I’m eager to share the stage with these gentlemen during JasperWorld.
We’re planning a wide variety of technical sessions that enable our partners, customers and community to learn, share and advance their BI agenda – all focused on Jaspersoft products and technologies. Of course, our technical founders (Teodor Danciu – JasperReports, and Giulio Toffoli – iReport) and many of our key technical staff members will attend, lead sessions and be available to meet with customers and community members. And, we have some great partners who will help sponsor this event, enriching the value for everyone. Lastly, we have some fun planned (of course) – Jaspersoft style. You’ll have to attend to learn more!
I hope to see you at JasperWorld 2011.
Chief Executive Officer
October 8, 2010
Leading the New Reporting Market
Yesterday, Jaspersoft announced JasperReports Server Professional Edition, our first commercial offering built precisely for those who require sophisticated reports, designed and developed professionally – and scheduled and delivered interactively.
This new offering is architected to match the substantial rise in interest in affordable reporting for both stand-alone and embedded uses, in organizations of all sizes. Indeed, by engaging our community and listening to their feedback, we designed a reporting server that addresses their need for interactive reporting with security and scheduling, while providing the assurance of a commercial subscription.
In addition to a robust feature set (for this reporting customer type), I am proud that we’re making this professional edition product available at a fraction of the price of any traditional competitor – delivering the most affordable and powerful reporting server available in the world today.
JasperReports Server is our recommended product for organizations requiring an affordable reporting solution for interactive, operational, and production-based reporting. Deployed as a standalone reporting server or integrated inside another application, JasperReports Server is a flexible, powerful, interactive reporting environment for small or large enterprises. And, it’s powered by the world’s most popular reporting tools: JasperReports and iReport. Now, developers and users can take advantage of more interactivity, security, and scheduling of their reports with a remarkably cost-effective offering.
I expect this to be the perfect complement for those who've been using JasperReports (and iReport), so we've announced JasperReports Server Professional at a special introductory price and made it very easy to learn more about this product. Here are some resources to do so:
Product summary information, including a brief demonstration
Because Jaspersoft is so well-known for reporting, I’m watching the feedback and questions about this new product very closely. If you have comments or questions, I’d be eager to know.
Chief Executive Officer
September 23, 2010
I believe that, over time, Jaspersoft’s distinction will be less about it being an open source software company and more about its abilities as a great business intelligence software company. I expect declining distinction for our open source-ness will partly occur because the success of open source software and the benefit it brings the community and customers become better accepted and understood each year (and, therefore, less unique). I also believe that the most valuable aspect of the open source model will long endure, way after the sheen fades from the download, forum post or roadmap voting. That is, the principles of open source software are its most distinguishing characteristic and will eventually reach not just all technology companies, but all other industries as well.
Doing the right thing when no one is watching may be the best definition of integrity. You combine that with frankness and honesty and you have the first open source principle, Transparency. With open source software, anyone can watch. Jaspersoft software engineers and our community contributors know that every line of code they write will be made available for inspection and comment by a very large community. If they have any discomfort with transparency, they would choose a different vocation.
Actively giving back in a very tangible way is the heart of participation. Making the open source projects, of which each community member is part, more successful and more capable should be the common goal. Giving back can mean many things, including and especially either committing time through code contributions (for those community members with the skill and expertise) or purchasing / licensing the software if the project is in any way commercial open source. Code contributions can include not just feature advancements, but language translations, bug fixes, and quality assurance testing assistance, among others.
Open source community distinction emerges because its members participate by using either their time (i.e., skill) or their money. Either is valuable and helps to make the open source project thrive. The only sin in open source is not participating. In other words, if a community member is using open source software and deriving real benefit from its existence, then participating by providing time or money should be seen as basic and reasonable reciprocity.
Collaboration is about collective engagement for the common good and is the fastest route to open source project success. If an open source project is a neighborhood, then collaboration is the barn raising. Distinguishing this from “participation”, collaboration is about helping others in the community because doing so advances the project and its usefulness for everyone.
My favorite example of collaboration is knowledge sharing through forums, blogs and idea exchanges (in some circles, called ideagoras). On JasperForge, Jaspersoft’s open source community web site, there are more than 160,000 registered members who have collectively offered nearly 80,000 forum entries across all the listed top-level projects. The variety of questions and issues being addressed by and for community members within the forums is staggering. And, the vibrancy that emerges through this exchange of skill is core to large-scale community success.
While forum activity remains brisk, I’m equally proud of our guided use of an idea exchange within JasperForge. Each top-level project includes a roadmap where community members can comment and vote on planned features. This not only allows many voices to be heard, but provides a valuable calibration for Jaspersoft and its community, ultimately yielding the most important product features and advancements in approximately the best priority order.
There are many more examples of collaboration in action, across JasperForge and other leading open source sites, but these are some of my favorites.
I talk about these three principles of open source regularly, and I'm fond of concluding that the real benefit of collaboration accrues to those who participate transparently. That's just my clever way of mentioning all three of the open source principles in one actionable sentence. What are your favorite examples of these open source principles in action? Your thoughts and comments are always welcome.
Chief Executive Officer
August 9, 2010
For this blog post, I’ll describe which technologies will likely fuel these changing usage patterns and some product categories that will, therefore, get a boost.
These are data stores that use sophisticated indexing, compression, columnar storage, and/or other technologies to deliver fast querying for large data sets. Increasingly, newer entrants in this category are less expensive than their enterprise data warehouse and OLTP counterparts. Although these databases natively require structured data formats, they provide a tremendous new capability to deal with large data volumes affordably and with greater processing power. When combined with a sophisticated analytic tool (such as a ROLAP engine or in-memory analysis techniques), an analytic database can deliver speed, volume, and sophisticated multi-dimensional insight – a powerful combination. For more on this product category, check out this prior post.
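The core layout idea behind columnar stores is easy to see in miniature. Here is a minimal sketch in plain Python (not any particular analytic database, and with made-up data): an aggregate over a column-oriented layout scans only the one contiguous column it needs, while a row-oriented layout must visit every field of every row.

```python
# Toy illustration of row-oriented vs column-oriented storage.
# Not a real analytic database -- just the core layout idea.

rows = [
    {"region": "EMEA", "product": "A", "revenue": 120.0},
    {"region": "APAC", "product": "B", "revenue": 75.5},
    {"region": "EMEA", "product": "B", "revenue": 210.0},
]

# Row store: summing revenue means touching every row record.
row_total = sum(r["revenue"] for r in rows)

# Column store: each column is its own contiguous array, so the
# aggregate scans only the single column it needs.
columns = {
    "region": ["EMEA", "APAC", "EMEA"],
    "product": ["A", "B", "B"],
    "revenue": [120.0, 75.5, 210.0],
}
col_total = sum(columns["revenue"])

assert row_total == col_total == 405.5
```

At real scale this layout, combined with compression (similar values sit next to each other in a column), is what delivers the query speed described above.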
Distributed Data Processing via Hadoop
Large volumes of distributed data, typically generated through web activity and transactions, represent the fastest-growing data type. This data is commonly unstructured, semi-structured or complex, and holds great promise for delivering keen business insight if tapped properly. With the open source project Hadoop, and some upstart open source companies working to commercialize it, that previously untapped information capital is now ready to be unlocked. By enabling massive sets of complex data to be manipulated in parallel processes, Hadoop provides businesses a powerful new tool to perform “big data” analysis to find trends and act on data previously out-of-reach. Increasingly, big data will be a big deal and this is an important area to watch.
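The map/reduce model at Hadoop's core can be sketched in a few lines. This is a minimal, in-process illustration in plain Python (not the Hadoop API itself): a map step emits key/value pairs, a shuffle step groups them by key, and a reduce step folds each group into a result.

```python
from collections import defaultdict

# Toy map/reduce word count -- the processing model behind Hadoop,
# run in one process here rather than across a cluster.

def map_phase(record):
    # Emit a (word, 1) pair for every word in one input record.
    for word in record.split():
        yield word.lower(), 1

def reduce_phase(key, values):
    # Fold all the counts for one key into a single total.
    return key, sum(values)

records = ["big data is a big deal", "big insight from data"]

# Shuffle step: group every emitted value by its key.
groups = defaultdict(list)
for record in records:
    for key, value in map_phase(record):
        groups[key].append(value)

counts = dict(reduce_phase(k, v) for k, v in groups.items())
assert counts["big"] == 3 and counts["data"] == 2
```

What makes Hadoop powerful is that the map and reduce functions stay this simple while the framework distributes them across many machines and very large inputs.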
Complex Event Processing
On their own, large data volumes already create difficult analytic challenges. When that data is being created and updated rapidly (even imperceptibly to humans), a different approach to analysis is required. CEP tools monitor streaming data looking for events to help identify otherwise imperceptible patterns. I’ve referred to this technological concept elsewhere as the converse of traditional ad hoc analysis where the data persists and the queries are dynamic. With CEP, in a sense, the query persists and the data is dynamic. You can expect CEP-based, dynamic data analysis functionality to become more interesting and capable across a wider variety of uses each year.
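The "persistent query, dynamic data" inversion can be illustrated with a trivial standing rule evaluated against each arriving event. This is a minimal sketch in plain Python (not a real CEP engine; the threshold and readings are made up): the rule is registered once, and every new event in the stream is tested against it.

```python
from collections import deque

# Toy complex event processing: the query (a standing rule) persists
# while the data streams past it. The hypothetical rule here is
# "the average of the last 3 readings exceeds 100".

WINDOW, THRESHOLD = 3, 100.0
window = deque(maxlen=WINDOW)
alerts = []

def on_event(value):
    # Evaluate the persistent query against each arriving event.
    window.append(value)
    if len(window) == WINDOW and sum(window) / WINDOW > THRESHOLD:
        alerts.append(list(window))

for reading in [90, 95, 110, 130, 120, 40]:
    on_event(reading)

# The windows [95, 110, 130] and [110, 130, 120] trip the rule.
assert len(alerts) == 2
```

A production CEP engine adds expressive pattern languages, time-based windows, and throughput far beyond this sketch, but the inversion is the same: the query waits, the data moves.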
Simple, integrated, multi-dimensional views of data should not be available only to those who spent two weeks in a special class (think ROLAP or MOLAP). They should exist alongside your favorite bar or line chart and tabular view of data. The analysis should also be constructed for you by the server, persist in memory as long as you need it (and no longer), and then get out of your way when finished. Interacting with it should be as straightforward as navigating a hyperlinked report or pivot table -- although a variety of cross-tab types, charts, maps, gauges and widgets should be available for you to do so.
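At its core, the cross-tab view described here is a roll-up of flat records along two dimensions. A minimal sketch in plain Python (not Jaspersoft's engine; the sales records are made up) shows the pivot step:

```python
from collections import defaultdict

# Toy pivot/cross-tab: roll flat records up into a two-dimensional
# grid, one dimension per axis, aggregating at each cell.

records = [
    ("EMEA", "A", 120), ("EMEA", "B", 210),
    ("APAC", "A", 60),  ("APAC", "A", 15),
]

pivot = defaultdict(lambda: defaultdict(int))
for region, product, revenue in records:
    pivot[region][product] += revenue  # sum revenue per (region, product)

assert pivot["APAC"]["A"] == 75
assert pivot["EMEA"]["B"] == 210
```

Everything else in an interactive analysis tool (drilling, filtering, swapping axes) is variations on re-running this roll-up with different dimensions and measures.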
Ever since IBM acquired SPSS, statistical modeling is cool again (since when is IBM cool, btw?). The truth is that the natural progression when analyzing past data is to project it forward. With the need to deal with larger volumes of data and at lower latency, it stands to reason that predicting future results becomes more important. This is why I believe the R revolution is here to stay (R is the open source statistical analysis tool used by many in the academic and scientific world). I predict a growing commercial need for this open source juggernaut, and by this I mean a growing demand for tools based on R with more robust features and a commercial business model – and a few software companies are delivering.
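The progression from describing past data to projecting it forward is, at its simplest, fitting a model and evaluating it at a future point. A minimal sketch, with plain Python standing in for R (which is what the post actually discusses) and a made-up monthly series: ordinary least squares on past values, then a one-step-ahead prediction.

```python
# Toy predictive step: fit a least-squares line to past monthly
# values and project the next period. The data is illustrative only.

months = [1, 2, 3, 4, 5]
values = [10.0, 12.0, 13.5, 15.0, 17.0]

n = len(months)
mean_x = sum(months) / n
mean_y = sum(values) / n

# Ordinary least squares: slope = cov(x, y) / var(x).
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(months, values)) \
        / sum((x - mean_x) ** 2 for x in months)
intercept = mean_y - slope * mean_x

forecast = intercept + slope * 6  # project month 6
```

Real statistical tools such as R add confidence intervals, diagnostics, and far richer model families, but the commercial demand described above is for exactly this move: from reporting what happened to estimating what happens next.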
If you follow the Open Book on BI, you know I’m a big fan of mash-up dashboards. I expect these flexible, web-based constructs to deliver the most pervasive set of contextually relevant data, gaining broader use and enabling better decisions even without fancy predictive tools (although the output from a statistical model should be embeddable within a mashboard, maintaining its link back to the model and data source along with any relevant filters). Earlier this year, I wrote an article about making better, faster decisions through the clever use of mashboards. Making those good decisions is about understanding the past and recognizing current patterns, all while understanding the proper context. These relevant visual data elements should come together in a single, navigable view. Perfect for a mashboard.
So, this is my short list of business intelligence product categories and technologies that stand to gain substantially in the next few years. Surely I’ve not covered them all so your comments and feedback are encouraged.
Chief Executive Officer