There was a time when being a power Excel user gave you an edge on your resume: it implied you could perform your tasks more efficiently than your peers. Excel remains one of the most commonly used tools for analyzing data today, applied in scenarios ranging from financial modeling to data analytics to invoicing. However, it is running into a wall of limitations as the nature and use of data become more complex.
As the volume of data collected grows exponentially, processing it requires data analysts to reach for better tools beyond desktop software, and Excel is gradually becoming a relic of the past for heavy data processing.
Businesses now collect millions of data points across multiple countries. To determine which campaigns are the most effective at boosting conversion rates, or which customer segments to target, data analysts have to comb through data from millions of transactions on a daily basis. Traditional tools are simply overwhelmed by such tasks. To carry out the high-quality analysis required and make sense of this massive trove of data, one must go back to the source of the data: the database.
You see, good information provides superior insights that drive great decisions. The party armed with greater intelligence about its stakeholders gains a competitive edge over its peers. In the hands of a competent data analyst, this edge can be sharpened to help a company outperform its competitors.
The trend is not limited to internet start-ups. While data collection may be easier on the web, there is nothing to stop physical retail businesses from actively deploying technology to achieve the same as their online brethren. For example, point-of-sale systems in the offline world now capture more data points with each transaction, which are then stored in the cloud. Valuable insights can be drawn by those who know how to harness the power of this data.
The race to make sense of data gave birth to business intelligence tools and data analysts. The problem is that while many of the existing solutions on the market are designed to be easy to use, that ease translates into a loss of flexibility for analysts and scientists who are already master slicers and dicers of databases.
Just as Final Cut Pro and Adobe Photoshop are not the easiest tools to learn in video and graphic editing, yet provide incredible control for users who have mastered them, one should not expect a simple drag-and-drop interface to be sufficient for the complex queries that data analysts ask of their data. In some cases, these drag-and-drop interfaces may not even be intuitive for analysts and may add an unnecessary learning curve.
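To give a sense of the kind of question that quickly outgrows a drag-and-drop interface, here is a sketch of a SQL query comparing conversion rates by campaign and customer segment. The table and column names (visits, orders, campaigns and their fields) are hypothetical and would depend on the actual schema; the date filter and thresholds are illustrative only.

    -- Conversion rate by campaign and customer segment (hypothetical schema).
    -- Only segments with enough traffic are kept, to avoid noisy rates.
    SELECT
        c.campaign_name,
        v.customer_segment,
        COUNT(DISTINCT o.order_id) * 1.0
            / COUNT(DISTINCT v.visit_id) AS conversion_rate
    FROM visits v
    JOIN campaigns c
        ON c.campaign_id = v.campaign_id
    LEFT JOIN orders o
        ON o.visit_id = v.visit_id          -- visits without an order count as non-conversions
    WHERE v.visit_date >= DATE '2024-01-01'  -- example reporting window
    GROUP BY c.campaign_name, v.customer_segment
    HAVING COUNT(DISTINCT v.visit_id) >= 1000
    ORDER BY conversion_rate DESC;

Even a modest report like this involves joins, aggregation, filtering on aggregates and ordering, which is exactly where point-and-click tools start to strain while a few lines of SQL remain readable.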
For data analysts to reach their full potential and perform their jobs effectively, they must go back to the core, the database, rather than be confined within the limits of rigid software.
Working with databases has traditionally been viewed as a domain that is out of bounds to non-geeks. Faced with entire shelves of books on database administration, laymen are naturally intimidated by the daunting task of selecting one that is both suitable and understandable, not to mention the rarity of finding one that covers using SQL as a reporting tool.
We think that is about to change. Just as Excel brought spreadsheet skills to the mass market (where earlier spreadsheet software with similar features had failed to do so), we believe a tool will emerge to bring SQL skills mainstream for processing large volumes of data.
Data analysts who are willing to learn and embrace SQL will find it an invaluable companion to their Excel skills, and will be better prepared to handle the complexity of the data they will face.
Co-authored by Vincent Woon and Wilson Ong. This article also appeared on Tech In Asia.