Advanced Analytics uses sophisticated tools for granular data analysis to enable forecasts and predictions from data.
Quick Takeaway: Advanced analytics is a very effective form of data analysis because it allows you to dig deeper into data to predict the future of your business.
Analytics is a process that involves identification, interpretation, and communication of critical patterns found in raw data. This information is used by organizations to make strategic decisions that impact performance.
Quick Takeaway: Analytics puts your data to work.
Anomaly Detection (aka Outlier Analysis) is a technique used to identify items or patterns in data, known as anomalies, that do not conform to expected behavior. This method has a range of real-world applications, such as intrusion detection (strange patterns in network traffic signaling a hack), health monitoring (identifying malignant tumors in MRI scans), fraud detection (suspicious credit card transactions), technical glitches (malfunctioning equipment), and changes in consumer behavior.
Quick Takeaway: Anomaly Detection helps find unusual activity in data thereby indicating an area that needs further investigation.
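As a minimal sketch of the idea, here is a simple z-score check on a hypothetical list of transaction amounts; real-world anomaly detection usually relies on more robust statistical or machine-learning methods.

```python
from statistics import mean, stdev

# Hypothetical daily transaction amounts; the last value is an obvious outlier.
amounts = [102, 98, 110, 95, 105, 99, 101, 940]

mu, sigma = mean(amounts), stdev(amounts)

# Flag values that sit more than 2 standard deviations from the mean.
anomalies = [x for x in amounts if abs(x - mu) / sigma > 2]
print(anomalies)  # [940]
```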
Artificial Intelligence (AI) is the ability of a machine to perform actions that otherwise require human intelligence. For instance, tasks like visual perception, speech recognition, translation between languages, and decision-making can be supported and automated by AI.
Quick Takeaway: AI enables computer programs to think and act like intelligent humans. Minus the mood swings.
Augmented Analytics uses advanced technology to independently examine data, reveal hidden patterns, provide insights, make predictions, and generate recommendations. Artificial Intelligence and Machine Learning tools are used to automate the end-to-end process, right from data preparation, insight generation, and explanation, to augmenting the output with visualization and narratives.
Quick Takeaway: Augmented Analytics has changed the way people explore and analyze data on BI and analytics platforms. It is often described as "the future of Business Intelligence."
Behavioral Analytics is a part of Business Intelligence that uses data to focus on how and why users behave the way they do on social media platforms, eCommerce sites, online games, and other web applications.
Quick Takeaway: Behavioral Analytics follows virtual data trails to gain insights into user behavior online.
Big Data includes a variety of structured and unstructured data, sourced from documents, emails, social media, blogs, videos, digital images, satellite imagery, and data generated by machines/sensors. It comprises large and complex datasets, which cannot be processed using traditional systems.
Quick Takeaway: Big Data is large volumes of data generated at high speeds, in multiple formats that can be of value when analyzed.
Business Intelligence (BI) is the practice of analyzing data and presenting actionable insights to stakeholders to help them make informed business decisions.
Quick Takeaway: BI enables the right use of information by the right people for the right reasons.
Clustering groups data points with similar properties together for statistical analysis.
Quick Takeaway: Clustering is an Unsupervised Machine Learning technique.
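For illustration, here is a small sketch using scikit-learn's KMeans on hypothetical customer data (the feature values and cluster count are made up for the example):

```python
from sklearn.cluster import KMeans

# Hypothetical customers described by [annual spend, number of orders].
points = [[120, 3], [130, 4], [115, 2], [900, 40], [950, 45], [880, 38]]

# Group the points into two clusters based on similarity (unsupervised).
model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(points)
print(model.labels_)  # e.g. [0 0 0 1 1 1] -- low spenders vs. high spenders
```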
Dashboard is a tool used to create and deploy reports. It helps monitor and analyze key metrics on a single screen and see the correlations between them.
Quick Takeaway: Dashboard provides an overview of the reports and metrics that matter most to you.
Data Blending is a fast and easy method to extract data from multiple sources and blend it into one functional dataset.
Quick Takeaway: Data Blending combines data and finds patterns without the hassle of deploying a data warehouse architecture, which is why it is often preferred for quick analysis.
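A minimal sketch of blending in pandas, assuming two hypothetical exports (a CRM extract and a web analytics extract) that share a customer_id key:

```python
import pandas as pd

# Two hypothetical sources: a CRM export and a web analytics export.
crm = pd.DataFrame({"customer_id": [1, 2, 3], "region": ["East", "West", "East"]})
web = pd.DataFrame({"customer_id": [1, 2, 3], "visits": [14, 3, 27]})

# Blend them into one working dataset on the shared key -- no warehouse needed.
blended = crm.merge(web, on="customer_id", how="left")
print(blended)
```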
Data Cleaning is also referred to as data cleansing or scrubbing. It improves data quality through the detection and removal of inconsistencies and errors found in data.
Quick Takeaway: Data Cleaning transforms data from its original state into a standardized format to maximize the effect of data analysis.
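A small, hypothetical example of the kind of cleanup involved, sketched in pandas:

```python
import pandas as pd

# Hypothetical raw export with stray whitespace, a missing value, and a duplicate row.
raw = pd.DataFrame({
    "name": [" Alice ", "Bob", "Bob", None],
    "amount": ["100", "250", "250", "75"],
})

clean = (
    raw.dropna(subset=["name"])                           # remove records missing a name
       .assign(name=lambda d: d["name"].str.strip(),      # trim stray whitespace
               amount=lambda d: d["amount"].astype(int))  # standardize the data type
       .drop_duplicates()                                 # drop exact duplicate rows
)
print(clean)
```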
Data Cube is the grouping of data into multidimensional hierarchies, based on a measure of interest.
Quick Takeaway: Data Cube helps interpret a stack of data.
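A rough approximation of the idea in pandas, grouping a hypothetical revenue measure along three dimensions:

```python
import pandas as pd

# Hypothetical sales records with three dimensions and one measure of interest.
sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "product": ["A", "B", "A", "B"],
    "quarter": ["Q1", "Q1", "Q1", "Q2"],
    "revenue": [100, 150, 200, 120],
})

# Group the measure (revenue) along multiple dimensions --
# the same multidimensional hierarchy a data cube represents.
cube = sales.groupby(["region", "product", "quarter"])["revenue"].sum()
print(cube)
```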
Data Democratization enables all users to access and analyze data freely to answer questions and make decisions.
Quick Takeaway: Data Democratization is a ‘free-for-all’ access to data and its use. No holds barred.
Data Fabric is a unified environment of data services that provides consistent capabilities, namely data management, integration technology, and architecture design, delivered across on-premises and cloud platforms. A data fabric helps automate data access and sharing.
Quick Takeaway: Data Fabric puts the management and use of data into high gear using technology.
Data Wrangling (aka Data Munging, Data Transformation) is the process of unifying acquired datasets with actions like joining, merging, grouping, and concatenating, and cleansing them for easy access and further analysis.
Quick Takeaway: Data Wrangling is the step between data acquisition and data analysis.
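A short pandas sketch of typical wrangling steps (concatenate, join, group) on hypothetical monthly exports:

```python
import pandas as pd

# Hypothetical monthly exports acquired from two systems, plus a lookup table.
jan = pd.DataFrame({"order_id": [1, 2], "amount": [100, 250]})
feb = pd.DataFrame({"order_id": [3, 4], "amount": [75, 300]})
customers = pd.DataFrame({"order_id": [1, 2, 3, 4],
                          "segment": ["SMB", "ENT", "SMB", "ENT"]})

# Concatenate, join, and group the acquired data into an analysis-ready shape.
orders = pd.concat([jan, feb], ignore_index=True)
wrangled = orders.merge(customers, on="order_id").groupby("segment")["amount"].sum()
print(wrangled)
```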
Diagnostic Analysis (aka Root Cause Analysis) takes over from Descriptive Analysis to answer the question "Why did it happen?" It drills down to find causes for the outcomes and identify patterns of behavior.
Quick Takeaway: Diagnostic Analysis provides reasoning for the outcomes of the past by breaking down the data for closer inspection.
Drill-Down Capability helps visualize data at a granular level by providing flexibility to go deep into specific details of the information required for analysis. It is an important feature of Business Intelligence because it makes reporting a lot more useful and effective.
Quick Takeaway: Drill-Down Capability offers an interactive method to display multi-level data on request without changing the underlying query.
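The idea can be approximated outside a BI tool as well; this pandas sketch summarizes hypothetical sales by year and then drills down to quarters without changing the source data:

```python
import pandas as pd

# Hypothetical sales data covering two levels of granularity.
sales = pd.DataFrame({
    "year":    [2023, 2023, 2023, 2023],
    "quarter": ["Q1", "Q1", "Q2", "Q2"],
    "revenue": [100, 150, 200, 120],
})

# Summary level: revenue per year.
print(sales.groupby("year")["revenue"].sum())

# Drill-down: the same data, one level deeper, without changing the source query.
print(sales.groupby(["year", "quarter"])["revenue"].sum())
```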
Exception Handling is the process of responding to unexpected events (exceptions) encountered when a predefined set of steps is executed.
Quick Takeaway: Exception Handling deals with unexpected instances that may arise when an action is performed.
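A minimal Python sketch: the loop below keeps running when one hypothetical record is malformed, because the exception is handled rather than allowed to halt execution.

```python
# Hypothetical records; one of them cannot be converted to a number.
records = ["120", "85", "not-a-number", "42"]

total = 0
for value in records:
    try:
        total += int(value)  # the predefined step
    except ValueError:
        # The response to the unexpected event: log it and carry on.
        print(f"Skipping malformed record: {value!r}")

print(total)  # 247
```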
Feature Engineering creates new features from raw data using data mining methods.
Quick Takeaway: If data cleaning is a subtractive process, feature engineering can be looked at as an additive process.
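A small sketch of the additive idea, deriving hypothetical new columns from raw order data in pandas:

```python
import pandas as pd

# Hypothetical raw order data.
orders = pd.DataFrame({
    "order_date": pd.to_datetime(["2023-01-03", "2023-01-14", "2023-02-02"]),
    "items": [2, 5, 1],
    "total": [40.0, 125.0, 15.0],
})

# Engineer new features derived from the raw columns.
orders["avg_item_price"] = orders["total"] / orders["items"]
orders["order_month"] = orders["order_date"].dt.month
orders["is_bulk_order"] = orders["items"] >= 3
print(orders)
```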
Key Performance Indicator (KPI) (aka Key Metric) is an important indicator that helps measure a department or organization’s performance and health.
Quick Takeaway: KPIs indicate how a business is performing based on certain parameters.
Machine Learning (ML) is an application of AI that enables computer programs to learn from large datasets without explicit programming and to improve when exposed to new data. ML is used to automate the building of analytical models.
Quick Takeaway: ML is the ability of machines to self-learn based on data provided and accurately identify instances of the learned data.
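As a minimal illustration (not any specific product's approach), a scikit-learn model learns a pattern from hypothetical training data and applies it to new records:

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: [monthly visits, support tickets] -> churned (1) or not (0).
X_train = [[2, 5], [1, 7], [30, 0], [25, 1], [3, 6], [28, 0]]
y_train = [1, 1, 0, 0, 1, 0]

# The model learns the pattern from data rather than from hand-written rules.
model = LogisticRegression().fit(X_train, y_train)

# When exposed to new data, it applies what it has learned.
print(model.predict([[2, 6], [27, 0]]))  # expected: [1 0]
```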
Metadata provides information about other data within a database. It provides references to data, which makes it easier for end users to find and work with collected data.
Quick Takeaway: Metadata is data about data. For instance, username, date created/modified, and file size are basic document metadata.
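A quick Python sketch that writes a small hypothetical file and reads back its basic metadata:

```python
import os
from datetime import datetime
from pathlib import Path

# Create a small file, then read its metadata -- data about the file, not its contents.
path = Path("report.csv")  # hypothetical file name
path.write_text("region,revenue\nEast,100\n")

info = os.stat(path)
print("size (bytes):", info.st_size)
print("last modified:", datetime.fromtimestamp(info.st_mtime))
```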
Predictive Analysis uses summarized data to answer the question "What is likely to happen?" It uses past performance to make logical predictions of future outcomes, relying on statistical modeling to forecast estimates. The accuracy of these estimates depends largely on the quality and detail of the data used. It is widely used across industries to provide forecasts and risk assessment inputs in functions such as Sales, Marketing, HR, Supply Chain, and Operations.
Quick Takeaway: Predictive Analysis attempts to predict the future using advanced technology and skilled analysts to examine data, rather than a crystal ball.
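A deliberately simple sketch of the idea: fitting a linear trend to hypothetical monthly sales and projecting one month ahead. Real predictive models are far more sophisticated.

```python
import numpy as np

# Hypothetical monthly sales history (past performance).
months = np.array([1, 2, 3, 4, 5, 6])
sales = np.array([100, 110, 125, 130, 145, 150])

# Fit a simple linear trend (a basic statistical model) and forecast month 7.
slope, intercept = np.polyfit(months, sales, 1)
forecast = slope * 7 + intercept
print(round(forecast, 1))
```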
Prescriptive Analysis is the last level of data analysis wherein insights from other analyses (Descriptive, Diagnostic, Predictive) are combined and used to determine the course of action to be taken in a situation. Needless to say, the technology used is a lot more advanced and so are the data practices.
Quick Takeaway: Prescriptive Analysis prescribes data-driven next steps for decision making.
Slice and Dice refers to the division of data into smaller, uniform sections that present the information in diverse and useful ways. For instance, a pivot table in a spreadsheet.
Quick Takeaway: Slice and Dice is the breakdown of data into smaller parts to reveal more information.
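A small pandas sketch of the pivot-table example, slicing hypothetical revenue by region and dicing it by product:

```python
import pandas as pd

# Hypothetical sales records.
sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "product": ["A", "B", "A", "B"],
    "revenue": [100, 150, 200, 120],
})

# A pivot table presents the same data along different dimensions at once.
pivot = sales.pivot_table(values="revenue", index="region",
                          columns="product", aggfunc="sum")
print(pivot)
```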
Snapshot refers to the state of a dataset at a given point in time.
Quick Takeaway: A snapshot provides an instant copy of data, captured at a certain time.
Software as a Service (SaaS) is a delivery model for software, centrally hosted by the vendor and licensed to customers on a pay-for-use or subscription basis.
Quick Takeaway: SaaS packages and sells software as a service.