Planet Jaspersoft

April 1, 2015

TIBCO Analytics - One Year Later

Brian Gentile (The Open Book on BI)

The past twelve months have been a wild ride in the world of business intelligence and big data, full of acquisitions (including Jaspersoft’s addition to the TIBCO Software family) and innovations. Since forming TIBCO Analytics, we’ve entered an extraordinarily important time for the business and for all those customers who’ve placed their trust in us as their partner for data analysis and reporting.  Because of that trust, and the innovative tools at our disposal as part of TIBCO Software, we have a unique opportunity to deliver best-in-class value with each of our tools.

TIBCO Analytics brings together two market-leading platforms:  TIBCO Spotfire, the award-winning platform for data discovery, analysis and visualization, and TIBCO Jaspersoft, the award-winning platform for embeddable reporting and analysis.  These two platforms solve very different analytics problems for different types of customers.  Even the fundamental aspects of these two platforms, and the uses for which they are best suited, are complementary, which becomes a growing point of emphasis as we aim to deliver increased value for our customers with each product release.

Through the combination of these products and formation of a business unit focused exclusively on Analytics, we have become one of the largest business intelligence tool providers in the industry, and boast one of the largest customer bases as well.  Our significant level of investment in Analytics allows us to solve some of the most important, new analytics problems facing our customers.

We’ll solve these problems through the continued improvement and innovation of both primary platforms (Spotfire and Jaspersoft), and we’ll do so in ways that leverage the strengths of one platform to help the other become more powerful in solving the needs of its core customer.  In other words, we’ll use Jaspersoft-centric features and techniques to help make Spotfire an even more capable data discovery and visualization tool, thereby affordably reaching the broadest possible audience of users, driving greater value in a customer’s investment.

In addition, we’ll use Spotfire-centric features and techniques to help make Jaspersoft an even more capable embedded reporting and analysis tool, enabling more powerful and visual use of data, which also drives greater value in a customer’s investment.

Our longer-term strategy for our Analytics platforms centers on cloud architecture and cloud-based delivery.  Our aim is to literally re-imagine business analytics – the way it is designed, developed, delivered, deployed and consumed – to create a seamless spectrum of web-based analytic services, which can enable anyone to have a personalized or tailored experience that is perfectly fit-for-purpose. It’s an exciting vision and, when we discuss this directly and in more detail with our customers, it never fails to capture their excitement as well.

I look forward to the TIBCO Analytics team describing this vision in more detail for you, and then to gaining your feedback and engagement, so you might join us on this amazing step in the journey toward next-generation data analytics.



January 5, 2015

2014 was an eventful year for Jaspersoft. We launched two new versions of our flagship product, JasperReports Server: v5.6 and v6, both full of clever innovations. We reached new records for total number of customers, product downloads, and registered community members. Oh yeah, and we were acquired by TIBCO Software at the end of April.

Now that Jaspersoft is part of the TIBCO Analytics family of products, a few things change, but most things stay the same. I’m excited that Jaspersoft is now part of a bigger company with complementary Analytics tools, which will allow it to solve more customer problems in more areas of the globe. I’m also thankful that we’ll continue all that has made Jaspersoft’s business intelligence solutions great: a focus on innovation, a commitment to embeddable analytics, being a great provider of open source software and member of the Open Source community, and always putting the interests of customers and community first.

What’s Changed, Now That Jaspersoft is Part of TIBCO?
Jaspersoft products are now part of a bigger, more comprehensive set of Analytics offerings, which will ultimately allow them to solve a broader set of customer analytics needs.  Within TIBCO Analytics, the Jaspersoft product line now sits next to TIBCO Spotfire, the market-leading data discovery, analysis and visualization tool that has been part of TIBCO since 2007.  These two tools (Jaspersoft and Spotfire) are extremely complementary, so you can expect, over time, each tool to help the other become even stronger in its primary category (Jaspersoft in embedded reporting and analytics, Spotfire in data discovery and visualization).

Jaspersoft personnel are now part of a bigger, more global infrastructure and workforce that will allow the team to scale more easily and serve customers more fully.  Because of Jaspersoft’s commercial open source roots, its customers are all over the globe.  Now, as part of a much larger enterprise focused exclusively on mission-critical business software, the Jaspersoft team will be able to work more efficiently with customers wherever they are and satisfy a broader range of projects and needs than we could have as an independent company.  I am hopeful that, like me, our customers see this as a powerful next step for Jaspersoft.

What Stays the Same, Now That Jaspersoft is Part of TIBCO?
All the things about Jaspersoft that matter most to customers should remain the same: 
  • Our relentless focus on innovation, which propels us to create the most affordable reporting and analytics solutions to solve the next decade’s analytics problems; 
  • Our leadership in embedded reporting and analytics, which allows Jaspersoft tools and the insight they provide to become a contextually-relevant part of thousands of other pieces of software, helping millions of users each day to become more capably analytic;
  • Our being a great Open Source software provider, community member, and participant;
  • Our never-ending drive to build an efficient and successful operating model, which ensures our customers receive the greatest value in licensing and using our software;
  • Our laser focus on customer success, our highest priority goal, along with our recognition that we succeed only when our customers succeed; and
  • Our constant quest to reach everyone with analytics; that is, our understanding that analytics must become a thing that you do, not a place that you go.
As we journey forward into 2015, I hope that you, our customers and partners, will let me know how we are doing against this list of things that will and won’t change.  My responsibility is to ensure that a broader, deeper and, ultimately, even more successful relationship emerges between our customers and TIBCO Analytics, of which Jaspersoft is a critical part.  As always, my email inbox is open and I’d like to hear from you.

Thank you for your business and partnership.  Best wishes in 2015.

Brian Gentile
Former Chairman & CEO, Jaspersoft
Now Sr. Vice President & General Manager, TIBCO Analytics
email: bgentile@tibco.com



February 11, 2014

What's Next for Big Data?

Brian Gentile (The Open Book on BI)

Big Value.  Broad Usefulness.

The time for putting all data to work is now.  The cost of collecting and using all data is plummeting. Meanwhile, the tools to access and work with data are becoming simpler. New, big, multi-structured data sets, combined with the significant architectural advances of cloud computing, have created advantages accessible to any organization.  Most profoundly, the next few years will usher in a new class of application, one that uses all available data and is built on the superior economics of cloud-based computing. This class of applications will help make big data small and will be purpose-built to solve an enormous array of real business problems. This is what’s next for Big Data.

Cost reduction and improving economics are key ingredients in the pervasive use of data.  Open source projects are fueling the rise in usefulness of Big Data and creating disruption on a grand scale.  Apache Hadoop and Cassandra are two of the best-known examples, but they are just the tip of the iceberg. A recent IDG survey showed that 70% of enterprises have already implemented or are planning to implement a Big Data project, and about 30% of respondents reported using an open source big data framework, such as Hadoop. Our own Big Data survey described a doubling of Big Data projects (versus our same survey last year) as well as a marked reduction (a 47% decrease) in confusion about the definition and usefulness of Big Data.

I’ve written previously about the amazing advancements in sensor technologies, storage resources and compute cycles – which create a perfect, low cost foundation for developing and delivering new applications that harness huge data sets to new effect. Two of my favorite examples of this new generation of applications in action are Clockwork and TrackX (formerly Fluensee).

Clockwork has created a modern, cloud-based Enterprise Asset Management system that is transforming the management of capital-intensive assets for any size of organization.  Clockwork recently reported that it has saved its customers more than $3 billion in the past year by combining sensors and sophisticated software to fundamentally improve the maintenance, repair and operations of their assets.

TrackX creates software and technology to effectively track and manage physical assets using RFID, barcode, GPS and sensor technologies with its rapidly-deployed, cloud-based software.  A key ingredient in TrackX solutions is analytics, because its mission is to provide insight on all asset-related data. TrackX takes all asset attributes and measurements and creates an analytical picture that provides customers with actionable intelligence to better optimize time, labor and asset productivity.  This insight is cleverly designed into its application software, helping anyone in the organization to become more analytically astute.

So, What’s Next for Big Data?

The two large growth areas for analytics are: 
1. making data scientists far more capable by giving them more useful and cost-effective tools than ever before, including Jaspersoft’s recent innovations enabling rapid blending of big and small data through data virtualization;
2. making everyone else in the organization more capably analytical by providing them with just the right amount of analytics at the right place and the right time (no more, no less) within the applications and processes they use every day. As a result, they can make better data-driven decisions than ever before.

The second growth area above will have the broadest, most transformational effect on an organization because it can involve everyone, regardless of level or title, making decisions on a daily basis. IDG’s recent survey found that 42% of the respondents plan to invest in these applications in 2014, making this one of the top five areas for investment. 

This investment level should provide us with great hope, because, although we’re in the very first phase of a revolution in data and analytics, the evidence is now clear.  Dramatically lower costs for capturing and using all the data, combined with a new breed of cloud-based analytical applications, will power huge gains in value and usefulness for data in any organization.


January 20, 2014

Four Trends That Will Shape 2014 

From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics. Think of this as my travel log and diary distilled into a short, easily readable series of blog posts. I invite you to follow this short series through my first, second and third installments and now my fourth (and final) below.  Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's product priorities as well. I look forward to the on-going dialog and I thank our customers and partners for their business and partnership, which mean everything to us.

Putting more data to work drives innovation.  Innovation can transform processes, products, services and people. Our newfound ability to cost-effectively analyze and find hidden patterns in huge swaths of data will enable a new generation of business-led technology innovation.  With this trend, the IT organization must find new ways to integrate and collaborate within the enterprise, becoming an enabler of business-led innovation.  This collaboration is more important than ever as technology now defines the new economic battleground for every industry and organization.  Even Gartner’s latest predictions abound with a Digital Industrial Revolution theme and watchwords for CIOs and their IT organizations to either lead or get out of the way.  It’s a bold new world.

All companies are now technology companies.  Every organization must put technology to work in ways that create distinction and competitive advantage.  Evidence of this trend can be found in any industry-leading company today, where IDC says that business units already control 61% of tech spending.  Fortunately, the technological barriers to entry have never been lower.  Organizations of all sizes now have affordable access to powerful, enterprise tools, which levels the playing field, allowing even small companies to compete with the big guys (sometimes even more effectively, because of their nimbleness).  One example is AirIT, which can help every airport become a technology-enabled data center, driven by metrics relevant to the business, which in turn, streamline operations and save money.

Leading enterprises will overtly staff, skill and organize to maximize innovative uses of technology – creating a cascade that will impact education, training and personnel management in all corners of the organization.  Even military organizations realize that gaining skill and expertise in data and analytics will remain at the forefront for personal advancement.  The risk for all is that a concentration of even deeper technology skills will create digital haves and have-nots within industries, creating a difficult spiral for laggards.

Lastly, in order for business-led tech innovation to really flourish, many more knowledge workers (than today) must have access to just the right amount of data and analysis, at the right place and the right time (not too much, not too little), which promises to make everyone a more capable analyst and decision maker (regardless of job level, title and even skill).  In 2014, analytics becomes a thing that you do, not a place that you go, and the need for intelligence inside the applications and business processes we use every day becomes an agent of business-led tech innovation.


January 13, 2014

From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics.  Think of this as my travel log and diary distilled into a short, easily readable series of blog posts.  I invite you to follow this short series through my first and second installments and now my third below. Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's product priorities as well.  I look forward to the on-going dialog and I thank our customers and partners for their business and partnership, which mean everything to us.

Trend #3:  From Schema-Constrained to Idea-Constrained, The Real Big Data Opportunity

In the past (and too often today), we collected just the data that we could afford to store and for which we had a clear, known use.  In this sense, we were hard-wired to winnow the data down to its most obvious and practical subset; thus, we were (and are) schema-constrained. By this I mean that today we must know, in advance, the uses for data as they are being captured.  This mindset leaves little-to-no room for future, latent value that may exist within a data set.  In physics, we recognize that energy has an immediate value (kinetic) and a future value (latent).  Why should data be any different?  

As costs have declined and the power of technology has increased exponentially, we now have the ability to store and use ALL of the data, not just some of the data. But we may not always know the value of this data while it is being captured.  That’s okay.  The latent value of data will become more obvious each year, and the technology now exists for this to be the norm.  In this sense, the real big data opportunity is based on the scale of our ideas for putting data to work, finding new correlations and value where they were not previously discernible.

Unlocking this new value becomes easier as the world is increasingly digitized; that is, we now regularly put very new data types to work:  geo-position / location, sensor updates, click streams, videos and pictures, documents and forms, etc.  Just a few years ago, almost none of this would have been considered “data”. Commonly using all these new data types and searching for correlations that can positively impact business will shift the primary constraint to the quality and quantity of our ideas.  

Perhaps my favorite example of latent data use is the world’s first consumer light field camera from Lytro.  The Lytro camera captures and processes the entire light field (11 million rays of light, to be precise).  This allows every shot to be filtered and viewed from any perspective, after the shot, allowing the user to later uncover what the best focus, angle, and zoom might yield.  Lytro refers to the result as a “living picture” with a 3-dimensional feel.  What a beautiful use of big, latent data.

In 2014, we’ll move from being schema-constrained to idea-constrained, more often finding the real value in Big Data.


January 6, 2014

From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics. Think of this as my travel log and diary distilled into a short, easily readable series of blog posts. I invite you to follow this short series with my first installment here and my second below.  Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's business intelligence product priorities as well.  I look forward to the on-going dialog and I thank our customers and partners for their business and partnership, which mean everything to us.

As cloud-originated data growth accelerates, the attractiveness of using additional cloud-based data management and analysis services will grow too. I’ve previously written about an entirely new generation of platform and middleware software that will emerge to satisfy the new set of cloud-based, elastic compute needs within organizations of all sizes.  Commonly at this software layer, utility-based pricing and scalable provisioning models will be chosen to more strictly match consumption patterns with fees paid.  These improved economics, in turn, will enable broader use of platform and middleware software, especially reporting and analytics, than ever before possible.

Additionally, cloud-based delivery portends a level of simplicity and ease-of-use (consumer-like services) that defies the earlier generation of enterprise software, ushering in deeper consumption of analytics by organizations of all sizes.  In short, cloud-based delivery becomes a key component of the quest for pervasive analytics – especially when those analytics are delivered as components within the web applications we use every day.

According to Nucleus Research: “As companies learn to take full advantage of the analytics functionalities that are now available with utility and subscription-based pricing options, they will continue to become more able to take advantage of market trends and opportunities before their peers and take advantage of the average return of $10.66 for every dollar spent in analytics.” In 2014, cloud-based delivery will change the analytics consumption pattern.


December 30, 2013

Four Trends That Will Shape 2014

Brian Gentile (The Open Book on BI)

My favorite part of being CEO of Jaspersoft is spending time directly with our customers and partners.  I learn so much about how we are succeeding together and what Jaspersoft can do better to help them succeed more fully.  Learning from our customers is a passion at Jaspersoft.  We all take notes from our customer discussions and share them throughout the company in an effort to constantly grow our understanding and push us forward.  Their plans and ideas become our ambitions - to create a far better approach to the next generation of reporting and analytics.  Our hope and intention is that our customers will be compelled to rely on Jaspersoft for a continually larger portion of their projects and needs.

From my many travels and conversations in 2013, I've synthesized four primary trends that I believe will make 2014 a transformative year for reporting and analytics.  Think of this as my travel log and diary distilled into a short, easily readable series of blog posts.  I invite you to follow this short series, starting with my first installment here.  Your comments and ideas will surely help shape not only my perspective on the future of reporting and analytics, but Jaspersoft's product priorities as well.  I look forward to the on-going dialog and I thank our customers and partners for their business and partnership, which mean everything to us.

Trend #1:  Forget Sentiment Analysis, Sensors + Software Will Change the World

Much of the Big Data hype has focused on social media and sentiment analysis, in an effort to get closer to the customer and better understand the market in which an organization competes.  While this is a valid goal, relatively few organizations will find both the skill and useful data patterns that add up to a material top-line difference.

Instead, the focus should be on the “Internet of Things”, for the transformative power it represents.  Every day, I see more powerful examples of sensors and software in action.  I prefer to describe it as “sensors + software” because these terms better symbolize the grittier, more real-world value that can be delivered by measuring, monitoring and better managing vast amounts of sensor-generated data. Why is this important in 2014?  Firstly, sensor technology has become remarkably low cost (an RFID tag, for instance, can cost as little as 50 cents, according to this report), which means more data points.  Secondly, the data storage and analytic technology to capture and analyze this data is incredibly low cost and widely available (often in open source editions). Lastly, sensor-based data is well suited to correlation analysis, rather than a strict search for causation, which increases the potential for finding value in this machine-generated data.
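
To make the correlation point concrete, here is a minimal sketch (in Java, with invented sensor readings) of the kind of calculation involved: a Pearson correlation coefficient computed across two machine-generated series. The scenario and numbers are hypothetical; real sensor analysis runs at far greater scale, but the statistic is the same.

```java
import java.util.Arrays;

/**
 * Minimal sketch: Pearson correlation between two equal-length series of
 * sensor readings (e.g., hourly RFID scan counts vs. conveyor temperature).
 * Illustrative only; the series and scenario are hypothetical.
 */
public final class SensorCorrelation {

    static double pearson(double[] x, double[] y) {
        if (x.length != y.length || x.length == 0) {
            throw new IllegalArgumentException("series must be the same, non-zero length");
        }
        double meanX = Arrays.stream(x).average().orElse(0);
        double meanY = Arrays.stream(y).average().orElse(0);
        double cov = 0, varX = 0, varY = 0;
        for (int i = 0; i < x.length; i++) {
            double dx = x[i] - meanX, dy = y[i] - meanY;
            cov += dx * dy;   // co-movement of the two series
            varX += dx * dx;  // spread of each series on its own
            varY += dy * dy;
        }
        return cov / Math.sqrt(varX * varY);  // r in [-1, 1]
    }

    public static void main(String[] args) {
        double[] scansPerHour = {120, 135, 150, 160, 180, 210};
        double[] temperature  = {20.1, 20.9, 21.7, 22.0, 23.2, 24.5};
        System.out.printf("r = %.3f%n", pearson(scansPerHour, temperature));
    }
}
```

A coefficient near +1 or -1 flags a relationship worth investigating, without claiming anything about causation.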

Analyst predictions are vast for the economic and far-reaching value of making “Things” smarter and connecting them to the Internet.  Why limit analysis to the words and attitudes of a relatively vocal few (social media and sentiment analysis), when you can analyze the actual behavior of a much larger population (sensor data)? So, I believe a quiet revolution is already underway.  In 2014, sensors + software will change the world.


April 8, 2013


Last month, Jaspersoft announced the industry’s first completely pay-as-you-go reporting and analytic service on Amazon’s AWS Marketplace.  With this service, you can literally be up-and-running (analyzing your data) in less than 10 minutes and pay as little as 52 cents per hour to do so.  And, as we’ve just announced, Amazon and Jaspersoft added more than 100 customers during the first month of availability – a great start to a new service destined to change the way BI is consumed for many purposes.

One of my favorite university professors recently asked me what worries me the most about being on the cutting edge with Amazon and this new service.  My response:  NOT being on the cutting edge with Amazon and this new service.  In other words, I would worry most about not innovating in this way.  Disrupting through both our business model and product innovation is a critical part of our culture at Jaspersoft.

In fact, the early success of our new Amazon-hosted service reminded me of two fast-emerging, inter-related cloud computing concepts that, though not discussed sufficiently, will have a substantial impact on the future usage and adoption of cloud-based computing services. These two concepts are: cloud-originated data and the post-transactional cloud [1].  I maintain that, as the former quickly grows, the latter becomes commonplace.

Cloud-Originated Data
While the total digital universe currently weighs in at nearly 3 zettabytes, it is estimated that more than one exabyte of that data is stored in the cloud.  Each day, the growth rate of cloud-originated data increases because of the explosion in services and applications that rely on the cloud as infrastructure.  So, a disproportionate amount of the 13X growth projected for the digital universe between now and 2020 will come from cloud-originated data. IDC estimates that by 2020, nearly 40% of all information will be “touched” by cloud computing (somewhere from origination to disposal).  Eventually, most of the digital universe will be cloud-based.

The growth in Amazon’s Simple Storage Service (S3) provides another compelling data point for the growth of cloud-originated data. In the past several years, Amazon’s S3 service has seen meteoric growth, now storing nearly one trillion objects (growing by 1 billion objects per day) and handling more than 650,000 requests per second (for those objects). The chart below illustrates this dramatic growth [2].

[Chart: total objects stored in Amazon S3 over time, reaching roughly 905 billion objects by early 2012; see note 2.]
Importantly, cloud-originated data is more easily liberated (post-transaction) by other cloud services, which can unlock additional value easily and affordably.  According to a recent report by Nucleus Research, companies that more quickly utilize cloud-based analytics are likely to gain a competitive advantage:

“As companies learn to take full advantage of the analytics functionalities that are now available with utility and subscription-based pricing options, they will continue to become more able to take advantage of market trends and opportunities before their peers and take advantage of the average return of $10.66 for every dollar spent in analytics.”

Ultimately, analytics is just one of many important post-transactional uses of cloud-based data, which will surely be the subject of future posts.

Post-Transactional Cloud
My working definition of the post-transactional cloud is “the next-generation of cloud services, beyond Software-as-a-Service (SaaS), designed to enable platform and middleware tools to use cloud-originated transactional data and deliver a richer, more sophisticated computing experience.”

The concept of a post-transactional cloud provides a powerful analog that mirrors the history of the on-premises computing world. Let me elaborate.

The ERP/CRM/Supply Chain application boom of the ‘80s and ‘90s preceded an enormous need in the ‘90s and ‘00s for additional tools and software systems designed specifically to create even greater value from the data generated by these (on-premises) transactional applications. Then, tools for data management, data integration, data warehousing and business intelligence (reporting and analytics) were born to deliver this new value.

Cloud computing has grown substantially in the last 10 years largely because of applications hosted in the cloud and made available as a service directly to consumers and businesses.  The poster child here is Salesforce.com (although there are thousands of others).  Note that we call this category “Software-as-a-Service” when it really should be called “Application-as-a-Service”, because the providers in this category are delivering a transactional, process-oriented application designed to automate and improve some functional aspect of an organization.  As the use of these managed services/applications grows, so too does the quotient of cloud-originated data generated by these applications.

The dramatic rise in cloud-originated data from SaaS applications portends a similar need: this one for post-transactional, cloud-based tools and software systems to define a new usage curve for liberating cloud-based data and creating substantially new organizational value. It’s just a matter of time, which makes Jaspersoft’s work with Amazon clear and understandable.

In fact, Jaspersoft’s cloud-based service (across all major Platform-as-a-Service environments, such as VMware’s CloudFoundry and Red Hat’s OpenShift, but right now, especially with Amazon’s AWS) helps ensure our tools are the de facto standard for reporting and analysis on cloud-originated data (in the post-transactional cloud). We’ll do this in two ways:
1. By bringing our BI service to customers who already prefer to use cloud services, and by being available in their preferred cloud instead of forcing them into our cloud; and
2.  By enabling elegant, affordable, embeddable reporting and analysis within cloud-based applications, so those who deliver this software can include intelligence inside their transactional applications.
At Jaspersoft, we ultimately see our cloud-based service as vital to reaching the broadest possible audience with just the right amount of reporting and analytics (not too much, not too little).  The post-transactional cloud will be fueled by cloud-originated data and the need to deliver cleverly-designed intelligence inside this environment will be more important than ever.

Brian Gentile
CEO, Jaspersoft


[1] I’ve borrowed the term “Post-Transactional Cloud” from ZDNet’s Andrew Brust, in his article entitled “Amazon announces ‘Redshift’ cloud data warehouse, with Jaspersoft support”.
[2] Data and chart excerpted from the TechCrunch article “Amazon S3: 905 Billion Objects Stored, 1 Billion Added Each Day”, Sarah Perez, April 6, 2012.



November 13, 2012

The Intelligence Inside: Jaspersoft 5

Brian Gentile (The Open Book on BI)

For more than three years, Jaspersoft has envisioned the capabilities we’ve just announced in our v5 platform. Because we’ve always intentionally constrained ourselves by delivering client (end-user) reporting and analysis functionality exclusively inside the web browser, our quest for v5 took longer than we would have liked. But we believe our simple, pure, web-server-based approach to advanced business intelligence is superior to relying on desktop-specific code or even browser plug-ins, which must be installed and maintained on every computer, preventing the scale and cost advantages Jaspersoft can offer.

So the interface techniques and features we deliver are constrained by key web client technologies, especially HTML. The trade-offs we’ve lived with in the past, though, are now essentially eliminated, as a new generation of HTML5 ushers in the consistent, advanced visualization and interaction we’ve long wanted, while allowing us to maintain our pure web-based client delivery model. Satisfaction.

Jaspersoft 5 is more than a pretty new face. We have delivered a completely new HTML5 visualization engine that enables a new level of rich graphics and interaction, but we’re also providing a host of new and more advanced back-end services that make Jaspersoft 5 even better suited to be the intelligence inside apps and business processes. In total, Jaspersoft 5 includes six major new features.

1. Data Exploration 
To enable everyone to become a more capable analyst, the Jaspersoft 5 platform includes stunning HTML5 charts, a new dimensional zoom tool (for exploring data at greater or lesser levels of detail), and the ability to simply change or customize charts and tables to suit a particular type of thought or analysis.

2. Data Virtualization 
Some reporting and analysis applications are best delivered without moving or aggregating data. Instead, the query engine should virtualize those data views and enable reports, dashboards and analytic views to include data from all necessary sources. Jaspersoft 5 includes an advanced data virtualization engine so that building advanced analysis using practically any data source is straightforward, including Big Data sources.
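
As a rough illustration of the idea (a conceptual sketch, not Jaspersoft’s actual virtualization engine), the code below joins rows from two independent databases at query time, without first copying either into a warehouse. The JDBC URLs, table names, and schemas are placeholders, and the appropriate JDBC drivers are assumed to be on the classpath.

```java
import java.sql.*;
import java.util.HashMap;
import java.util.Map;

/**
 * Conceptual sketch of data virtualization: combine data from two
 * independent sources at query time, with no intermediate warehouse.
 * All connection URLs and table names are hypothetical.
 */
public final class VirtualJoin {
    public static void main(String[] args) throws SQLException {
        try (Connection crm = DriverManager.getConnection("jdbc:postgresql://crm-host/crm");
             Connection web = DriverManager.getConnection("jdbc:mysql://web-host/analytics")) {

            // 1. Pull the small dimension (customers) into memory.
            Map<Integer, String> customers = new HashMap<>();
            try (Statement s = crm.createStatement();
                 ResultSet rs = s.executeQuery("SELECT id, name FROM customers")) {
                while (rs.next()) customers.put(rs.getInt("id"), rs.getString("name"));
            }

            // 2. Stream the large fact table (page views) and join on the fly.
            try (Statement s = web.createStatement();
                 ResultSet rs = s.executeQuery(
                     "SELECT customer_id, COUNT(*) AS views FROM page_views GROUP BY customer_id")) {
                while (rs.next()) {
                    String name = customers.getOrDefault(rs.getInt("customer_id"), "(unknown)");
                    System.out.println(name + ": " + rs.getLong("views") + " views");
                }
            }
        }
    }
}
```

A real virtualization engine adds query planning, push-down, caching and security on top of this basic pattern, but the essential move is the same: the join happens over live sources, and no data is relocated.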

3. Columnar In-Memory Engine 
JasperReports Server has supported in-memory operations for several years. Jaspersoft 5 takes this to a new level with improved performance, new features, and now support for up to a full terabyte of in-memory data. This means that billions of rows of data can be explored at memory speed with our new Server.
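
For readers curious why a columnar layout helps, here is a toy sketch (mine, for teaching; not Jaspersoft’s engine) contrasting row objects with parallel per-field arrays. An aggregate over one field then scans a single contiguous column, which is cache-friendly and compresses well; that is the basic mechanic behind columnar in-memory engines.

```java
/**
 * Toy illustration of the columnar idea: store each field as its own array
 * so an aggregate touches one contiguous column instead of whole rows.
 */
public final class ColumnarSketch {
    // Row-oriented alternative: one object per row; a SUM over amounts
    // would walk every object and drag all fields through the cache.
    record Order(int id, int regionId, double amount) {}

    // Column-oriented: parallel arrays, one per field.
    static final class OrderColumns {
        final int[] ids;
        final int[] regionIds;
        final double[] amounts;
        OrderColumns(int n) { ids = new int[n]; regionIds = new int[n]; amounts = new double[n]; }

        // The scan reads only the two columns it needs; contiguous and
        // easy to compress (e.g., run-length encode regionIds).
        double sumForRegion(int region) {
            double total = 0;
            for (int i = 0; i < amounts.length; i++) {
                if (regionIds[i] == region) total += amounts[i];
            }
            return total;
        }
    }

    public static void main(String[] args) {
        OrderColumns cols = new OrderColumns(3);
        cols.ids[0] = 1; cols.regionIds[0] = 7; cols.amounts[0] = 19.99;
        cols.ids[1] = 2; cols.regionIds[1] = 7; cols.amounts[1] = 5.00;
        cols.ids[2] = 3; cols.regionIds[2] = 9; cols.amounts[2] = 12.50;
        System.out.printf("%.2f%n", cols.sumForRegion(7)); // 24.99
    }
}
```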

4. Enhanced Analytics 
To give power-user analysts another reason to use Jaspersoft, we’re now including greater analytic performance, new analytic features (e.g., conditional formatting, relative date filtering, and cross-tab sorting), consistently rich visualization (see #1 above) and broadened access to multi-dimensional data sources. By supporting the latest XML/A standard, we gain certified access to Microsoft SQL Server Analysis Services (MSAS) data sources in addition to the traditional Mondrian. More power and greater choice equal greater usage.

5. Improved Administration and Monitoring 
To make life easier for those who administer and manage a JasperReports Server, we’re now using our own tool to make our Server smarter and simpler. We’ve designed a set of best-practice, interactive reports that display system health and report on the most important elements of usage. We’ve also streamlined the installation and upgrade process, so that getting started and staying up-to-date has never been easier. Together, these improvements are good for our customers and for the technical team who supports them.

6. PHP Support 
Scripting languages are now the most popular tools for web application development. The PHP community needs more advanced reporting and analysis tools to make their applications more data-driven. By extending the JasperReports Server API to include PHP support (via RESTful web service wrappers), we’ve taken an important first step toward supporting this fast-growing world beyond Java. Welcome to Jaspersoft.
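
For the curious, the sketch below shows the kind of plain-HTTP call such wrappers build on: fetching a report as a PDF over the web-service interface. It is written in Java for illustration, but the same request works from PHP or any language with an HTTP client. The host, credentials, and report path are placeholders, and while the /rest_v2/reports/{path}.{format} shape follows JasperReports Server’s REST v2 convention, consult your server’s documentation for the exact contract.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

/**
 * Sketch of fetching a report from JasperReports Server over plain HTTP.
 * Host, credentials, and repository path below are placeholders.
 */
public final class ReportFetch {
    public static void main(String[] args) throws Exception {
        String base = "http://localhost:8080/jasperserver";        // placeholder host
        String report = "/reports/samples/AllAccounts";            // placeholder repository path

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(base + "/rest_v2/reports" + report + ".pdf"))
                .header("Authorization", basicAuth("jasperadmin", "jasperadmin")) // placeholder login
                .GET()
                .build();

        // Stream the rendered PDF straight to disk.
        HttpResponse<Path> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofFile(Path.of("AllAccounts.pdf")));
        System.out.println("HTTP " + response.statusCode() + " -> " + response.body());
    }

    static String basicAuth(String user, String pass) {
        return "Basic " + java.util.Base64.getEncoder()
                .encodeToString((user + ":" + pass).getBytes());
    }
}
```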

Jaspersoft 5 is poised to deliver self-service BI to help many more users answer their own questions, not just because of the beautiful new HTML5 graphing and interaction engine, but because it is designed to be highly embeddable (into apps and business processes) and, maybe most importantly, because it scales so powerfully and affordably. Putting reporting and analytics into the hands of far more users requires this fundamental reset of the BI formula. This is Jaspersoft 5.

I invite you to learn more about Jaspersoft 5 here. And, I look forward to your comments and questions. 

Brian Gentile 
Chief Executive Officer 
Jaspersoft


July 27, 2012

Big Data: Approaches, Myths & Skills

Brian Gentile (The Open Book on BI)


Last month, my 18-year-old daughter asked me about Big Data. This is my first sure sign that a technology has reached a fever pitch in the hype cycle.  Ironically, I found that as I explained this enterprise IT topic to my daughter, our conversation and the questions she asked did not vary greatly from many conversations I’ve had with other CEOs, journalists, financial analysts and industry colleagues.  Despite how widely Big Data is being covered these days, it appears to me that Big Data is a big mystery to many.


Trying not to be labeled a cynic, I have three big worries about Big Data:

1. My biggest worry is the poor percentage of successful Big Data projects that will emerge as we too quickly throw these new technologies at a wide variety of prospective projects in the enterprise;
2. The low success rate of Big Data projects will be amplified by the current hype and subsequent misconceptions about Big Data technologies; and
3. This low project success rate could stay challenged over time because of the relative dearth of knowledgeable, data-savvy technology and business professionals ready for a world where data are plentiful and analytic skills are not.

Successful Big Data Projects

As organizations race to evaluate and pilot Big Data tools and technologies in pursuit of a Big Data opportunity, I’ve seen evidence that architectural steps are being skipped in favor of speed.  Sometimes, speed is good.  In the case of Big Data, building the right data and platform architecture is critical to actually solving the business problem, which means the right amount of thoughtful planning should occur in advance.  Many missteps could be avoided by simply being clear up-front about the business problem (or opportunity) to be solved and how quickly the data must be used to enable a solution (i.e., how much latency is acceptable?).

Recently, I’ve tried to do my part to help explain successful Big Data (technical) architectures by starting with three simple, latency-driven approaches.  The specifics, including an architectural diagram, are described in my recent E-Commerce Times article, entitled “Match the Big Data Job to the Big Data Solution.” We’ve also posted additional graphics and explanation to the Big Data section of the Jaspersoft website.

Big Data Misconceptions (or Myths)
To reduce the hype, first we must overcome the misconceptions. My many conversations on the topic of Big Data yield equally many misconceptions and misunderstandings. Some examples of the most common myths: Big Data is all unstructured, Big Data means Hadoop, and Big Data is just for sentiment analysis. Of course, each of these myths is only partially true and requires a deeper understanding of the technologies and their potential uses to gain real clarity.


I’ve recently offered a brief article, published last month on Mashable, that seeks to dispel the “Top 5 Myths About Big Data.” The article has garnered some great comments, with the most complete written by IBM’s James Kobielus. James improves and amplifies several of my major points. I hope you’ll join the conversation.

Analytic Skills Shortage
Worldwide digital content will grow 48% in 2012 (according to IDC), reaching 2.7 zettabytes by the end of the year.  As a result, big data expertise is fast becoming the “must-have” expertise in every organization.  At the same time, in its 2011 research report, titled “Big data: The Next Frontier for Innovation, Competition, and Productivity,” McKinsey offered the following grim statistic:

“By 2018, the United States alone could face a shortage of 140,000 to 190,000 people with deep analytical skills as well as 1.5 million managers and analysts with the know-how to use the analysis of big data to make effective decisions.”

Without the solid analytic skills needed to support a growing array of Big Data projects, the risk potential grows rapidly.  Anyone in or near data science should take the coming skills shortage as a call-to-arms.  Every college and university should be building data analytics coursework into compulsory classes across a wide variety of disciplines and subject areas. Because of its importance, I’ll save this Big Data skills topic as the thesis for a future post.

Despite these primary worries, I remain hopeful (even energized) by the enormous Big Data opportunity ahead of us.  My hope is that, armed with good information and good technology, more Big Data customers and projects will become more quickly successful.

Brian Gentile
Chief Executive Officer
Jaspersoft Corporation

