Advanced Big Data Science

Take the next leap from Hadoop to learn Big Data analytics using R, Python and Spark along with Hadoop

Did you know that Apache Spark has risen to become the most active open source project in big data? No wonder the McKinsey Global Institute estimates a shortage of 1.7 million Data Science and Big Data professionals over the next three years.


Considering this widening gap between demand and supply, this Advanced Data Science training helps IT/ITES professionals bag lucrative opportunities and boost their careers by gaining sought-after Big Data Analytics skills.


This is our advanced Big Data training, where attendees gain a practical skill set not only on Hadoop in detail, but also learn advanced analytics concepts through Hadoop components, Python, R and Spark. For extensive hands-on practice, candidates get access to the virtual lab along with several assignments and projects. At the end of the program, candidates are awarded the Advanced Data Science Certification on successful completion of the projects provided as part of the training.


A completely industry-relevant Big Data Analytics training and a great blend of analytics and technology, making it apt for aspirants who want to develop Big Data skills and get a head start in Big Data Analytics!


Course duration: 180 hours (at least 90 hours of live training + 12 hours of video-based training + practice and self-study, with ~8 hours of weekly self-study)

Who Should do this course?

Students coming from IT, Software or Data Warehouse backgrounds who want to get into the Big Data Analytics domain

Enroll to this course

Combo Deals!

Learn more, save more.
See our combo offers here.

Course Duration 180 hours
Classes 30
Tools R, Hadoop, Python, Spark
Learning Mode Live Training
Next Batch 22nd January, 2017

  • What is Data Science?
  • Data Science vs. Analytics vs. Data Warehousing, OLAP, MIS Reporting
  • Relevance in Industry and need of the hour
  • Type of problems and objectives in various industries
  • How leading companies are harnessing the power of Data Science
  • Different phases of a typical Analytics / Data Science project
  • Overview of Python- Starting Python
  • Introduction to Python Editors & IDEs (Canopy, PyCharm, Jupyter, Rodeo, IPython, etc.)
  • Custom Environment Settings
  • Concept of Packages/Libraries - Important packages (NumPy, SciPy, scikit-learn, Pandas, Matplotlib, etc.)
  • Installing & loading Packages & Namespaces
  • Data Types & Data objects/structures (Tuples, Lists, Dictionaries)
  • List and Dictionary Comprehensions
  • Variable & Value Labels – Date & Time Values
  • Basic Operations - Mathematical - string - date
  • Reading and writing data
  • Simple plotting
  • Control flow
  • Debugging
  • Code profiling
  • Importing Data from various sources (CSV, Txt, Excel, Access, etc.)
  • Database Input (Connecting to database)
  • Viewing Data objects - subsetting, methods
  • Exporting Data to various formats
  • Cleansing Data with Python
  • Data Manipulation steps (Sorting, filtering, duplicates, merging, appending, subsetting, derived variables, sampling, Data type conversions, renaming, formatting, etc.)
  • Data manipulation tools (Operators, Functions, Packages, control structures, Loops, arrays etc)
  • Python Built-in Functions (Text, numeric, date, utility functions)
  • Python User Defined Functions
  • Stripping out extraneous information
  • Normalizing data
  • Formatting data
  • Important Python Packages for data manipulation (Pandas, Numpy etc)
  • Introduction to exploratory data analysis
  • Descriptive statistics, Frequency Tables and summarization
  • Univariate Analysis (Distribution of data & Graphical Analysis)
  • Bivariate Analysis (Cross Tabs, Distributions & Relationships, Graphical Analysis)
  • Creating Graphs (Bar/pie/line chart/histogram/boxplot/scatter/density, etc.)
  • Important Packages for Exploratory Analysis (NumPy Arrays, Matplotlib, Pandas, scipy.stats, etc.)
  • Basic Statistics - Measures of Central Tendencies and Variance
  • Building blocks - Probability Distributions - Normal distribution - Central Limit Theorem
  • Inferential Statistics -Sampling - Concept of Hypothesis Testing
  • Statistical Methods - Z/t-tests (One sample, independent, paired), Anova, Correlations and Chi-square
  • Making Python talk to other languages and database systems
  • How do R and Python play with each other, why it's essential to know both
  • Introduction to Hadoop
  • Hadoopable Problems - Uses of Big Data analytics in various industries like Telecom, E-commerce, Finance and Insurance, etc.
  • Problems with Traditional Large-Scale Systems & Existing Data analytics Architecture
  • Key technology foundations required for Big Data
  • Comparison of traditional data management systems with Big Data management systems
  • Evaluate key framework requirements for Big Data analytics
  • Apache projects in the Hadoop Ecosystem
  • Hadoop Ecosystem & Hadoop 2.x core components
  • Explain the relevance of real-time data
  • Explain how to use Big Data and real-time data as a Business planning tool
  • HDFS Overview & Data storage in HDFS
  • Getting data into Hadoop from the local machine and vice versa (Data Loading Techniques)
  • MapReduce Overview (Traditional way Vs. MapReduce way)
  • Concept of Mapper & Reducer
  • Understanding MapReduce program skeleton
  • Running a MapReduce job from the command line
  • Introduction to PIG - MapReduce Vs Pig, Pig Use Cases
  • Pig Latin: Program & Execution
  • Pig Latin: Relational Operators, File Loaders, Group Operator, COGROUP Operator, Joins and COGROUP, Union, Diagnostic Operators, Pig UDF
  • Use Pig to automate the design and implementation of MapReduce applications
  • Data Analysis using Pig
  • Introduction to Hive - Hive Vs. PIG - Hive Use Cases
  • Discuss the Hive data storage principle
  • Explain the File formats and Records formats supported by the Hive environment
  • Perform operations with data in Hive
  • Hive QL: Joining Tables, Dynamic Partitioning, Custom Map/Reduce Scripts
  • Hive Script, Hive UDF
  • Introduction to Impala & Architecture
  • How Impala executes Queries and its importance
  • Hive vs. PIG vs. Impala
  • Extending Impala with User Defined Functions
  • Improving Impala Performance
  • Introduction to NoSQL Databases, types, and HBase
  • HBase vs. RDBMS, HBase Components, HBase Architecture
  • HBase Cluster Deployment
  • Introduction to Zookeeper/Oozie/Sqoop/Flume
  • Introduction to Apache Spark
  • Streaming Data Vs. In Memory Data
  • MapReduce Vs. Spark
  • Modes of Spark
  • Spark Installation Demo
  • Overview of Spark on a cluster
  • Spark Standalone Cluster
  • Invoking Spark Shell
  • Creating the Spark Context
  • Loading a File in Shell
  • Performing Some Basic Operations on Files in Spark Shell
  • Building a Spark Project with sbt
  • Running Spark Project with sbt
  • Caching Overview
  • Distributed Persistence
  • Spark Streaming Overview (Example: Streaming Word Count)
  • Analyze Hive and Spark SQL Architecture
  • Analyze Spark SQL
  • Context in Spark SQL
  • Implement a sample example for Spark SQL
  • Integrating Hive and Spark SQL
  • Support for JSON and Parquet File Formats
  • Implement Data Visualization in Spark
  • Loading of Data
  • Hive Queries through Spark
  • Performance Tuning Tips in Spark
  • Shared Variables: Broadcast Variables & Accumulators
  • Hadoop - Python Integration
  • Spark - Python Integration (PySpark)
  • Introduction to Machine Learning & Predictive Modeling
  • Types of Business problems - Mapping of Techniques
  • Major Classes of Learning Algorithms -Supervised vs Unsupervised Learning
  • Different Phases of Predictive Modeling (Data Pre-processing, Sampling, Model Building, Validation)
  • Concept of Overfitting (Bias-Variance Trade off) & Performance Metrics
  • Types of validation (Bootstrapping, K-Fold validation etc)
  • Linear Regression
  • Logistic Regression
  • Segmentation - Cluster Analysis (K-Means)
  • Decision Trees (CHAID/CART/C5.0)
  • Artificial Neural Networks (ANN)
  • Support Vector Machines (SVM)
  • Ensemble Learning (Random Forest, Bagging & boosting)
  • Other Techniques (KNN, Naïve Bayes, LDA/QDA etc)
  • Important Packages for Machine Learning (scikit-learn, scipy.stats, etc.)
  • Introduction to R / RStudio - GUI
  • Concept of Packages - Useful Packages (Base & other packages) in R
  • Data Structure & Data Types (Vectors, Matrices, Factors, Data Frames, and Lists)
  • Importing Data from various sources
  • Database Input (Connecting to database)
  • Exporting Data to various formats
  • Viewing Data (Viewing partial data and full data)
  • Variable & Value Labels – Date Values
  • Data Manipulation steps (Sorting, filtering, duplicates, merging, appending, subsetting, derived variables, sampling, Data type conversions, renaming, formatting, etc)
  • Data manipulation tools(Operators, Functions, Packages, control structures, Loops, arrays, etc)
  • R Built-in Functions (Text, Numeric, Date, utility)
  • R User Defined Functions
  • R Packages for data manipulation (base, dplyr, plyr, reshape, car, sqldf, etc.)
  • Introduction to exploratory data analysis
  • Descriptive statistics, Frequency Tables and summarization
  • Univariate Analysis (Distribution of data & Graphical Analysis)
  • Bivariate Analysis (Cross Tabs, Distributions & Relationships, Graphical Analysis)
  • Creating Graphs (Bar/pie/line chart/histogram/boxplot/scatter/density, etc.)
  • R Packages for Exploratory Data Analysis (dplyr, plyr, gmodels, car, vcd, Hmisc, psych, doBy, etc.)
  • R Packages for Graphical Analysis (base, ggplot2, lattice, etc.)
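The Mapper and Reducer concepts in the Hadoop portion of the syllabus above can be illustrated without a cluster. Below is a minimal pure-Python sketch of the classic word-count job, simulating the map, shuffle (sort/group) and reduce phases in-process; the function names are illustrative, not actual Hadoop APIs:

```python
from itertools import groupby
from operator import itemgetter

def mapper(line):
    # Map phase: emit a (word, 1) pair for every word in the input line
    for word in line.lower().split():
        yield (word, 1)

def reducer(word, counts):
    # Reduce phase: sum all counts that were emitted for one key
    return (word, sum(counts))

def run_job(lines):
    # "Shuffle" phase: sort intermediate pairs so equal keys become
    # adjacent, then group by key and hand each group to the reducer
    intermediate = sorted(pair for line in lines for pair in mapper(line))
    return dict(reducer(key, (c for _, c in group))
                for key, group in groupby(intermediate, key=itemgetter(0)))

counts = run_job(["big data big analytics", "data science"])
```

In real Hadoop Streaming, the mapper and reducer would be separate scripts reading stdin and writing tab-separated key-value pairs, with the framework performing the shuffle across the cluster.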
Analyzing e-commerce data using Pig-Hive-Impala
Objective: The objective of this case study is to give good hands-on experience in analysing retail data using Pig, Hive and Impala.
Problem Statement: One of the leading e-commerce companies would like to understand consumer behaviour using Hadoop-Pig-Hive-Impala.
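To give a flavour of the aggregation Pig performs in this kind of analysis, here is a minimal pure-Python sketch of a GROUP ... BY / SUM; the field names and data are invented for illustration, not taken from the actual case study:

```python
from collections import defaultdict

# Hypothetical retail rows: (customer_id, category, amount)
orders = [
    ("c1", "electronics", 120.0),
    ("c2", "books", 15.0),
    ("c1", "books", 25.0),
    ("c3", "electronics", 80.0),
]

# Rough equivalent of the Pig Latin:
#   grouped = GROUP orders BY category;
#   result  = FOREACH grouped GENERATE group, SUM(orders.amount);
spend_by_category = defaultdict(float)
for _, category, amount in orders:
    spend_by_category[category] += amount
```

In Pig itself the same aggregation runs as a distributed MapReduce job; the in-memory loop above only mirrors the semantics.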
Key Drivers for Customer Spending
Objective: The objective of this case study is to provide end-to-end steps to build and validate a regression model that identifies the key drivers of customer spend using Python-Spark.
Problem Statement: One of the leading banks would like to identify the key drivers of customer spending so that it can define a strategy to optimize product features.
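As a minimal sketch of the modelling step, here is closed-form ordinary least squares for a single hypothetical driver of spend; the actual case study uses Python-Spark and multiple predictors, and the data below is made up:

```python
def fit_simple_ols(x, y):
    """Closed-form OLS for y = b0 + b1*x (one candidate driver of spend)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Slope: covariance of x and y divided by variance of x
    b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    b0 = my - b1 * mx  # intercept through the means
    return b0, b1

# Hypothetical data: customer income vs. monthly spend
income = [20, 30, 40, 50, 60]
spend = [210, 310, 410, 510, 610]  # spend = 10 + 10*income by construction
b0, b1 = fit_simple_ols(income, spend)
```

With several drivers, the same idea generalizes to the matrix normal equations, which is what Spark's regression routines solve at scale.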
Predicting bad customers (defaulters) using credit customer application data
Objective: The objective of this case study is to provide end-to-end steps to build and validate a classification model using Python-Spark.
Problem Statement: One of the leading banks would like to predict bad customers (defaulters) based on the customer data provided in their applications.
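A minimal sketch of the classification step: logistic regression trained by gradient descent on a single made-up feature (the case study itself would use Python-Spark on real application data with many features):

```python
import math

def train_logistic(xs, ys, lr=0.1, epochs=2000):
    """Per-sample gradient descent on log-loss for a one-feature model."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            p = 1 / (1 + math.exp(-(w * x + b)))  # predicted default probability
            w -= lr * (p - y) * x                 # gradient of log-loss w.r.t. w
            b -= lr * (p - y)                     # gradient of log-loss w.r.t. b
    return w, b

def predict(w, b, x):
    # Classify as defaulter (1) if the predicted probability crosses 0.5
    return 1 if 1 / (1 + math.exp(-(w * x + b))) >= 0.5 else 0

# Hypothetical feature: number of missed payments; label: defaulted (1) or not (0)
missed = [0, 1, 2, 5, 6, 7]
labels = [0, 0, 0, 1, 1, 1]
w, b = train_logistic(missed, labels)
```

Validation in the course would then score a held-out sample and check metrics such as accuracy or AUC rather than a single threshold.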
Telecom Customer Segmentation
Objective: The objective of this case study is to apply advanced algorithms like factor and cluster analysis for data reduction and customer segmentation based on customer behavioural data.
Problem Statement: Build an enriched customer segmentation for one of the leading telecom companies and profile the segments using different KPIs in order to define a marketing strategy.
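A minimal sketch of the clustering step: plain k-means on hypothetical one-dimensional usage data. The real case study would cluster on multiple behavioural KPIs (often after factor analysis for data reduction); this toy version only shows the assign/update loop:

```python
import random

def kmeans(points, k, iters=50, seed=42):
    """Plain k-means on 1-D points (e.g. monthly usage minutes per customer)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)  # random initial centres
    for _ in range(iters):
        # Assignment step: attach each point to its nearest centre
        clusters = [[] for _ in range(k)]
        for p in points:
            clusters[min(range(k), key=lambda i: abs(p - centers[i]))].append(p)
        # Update step: move each centre to the mean of its cluster
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Hypothetical usage data with two obvious segments: light vs. heavy users
usage = [10, 12, 11, 13, 200, 210, 205, 198]
centers = kmeans(usage, k=2)
```

The resulting centres summarize each segment; profiling then compares KPIs (spend, churn, tenure) across the segments.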
Air Passengers Forecasting
Objective: The objective of this case study is to give hands-on experience in applying different time series forecasting techniques (averages/smoothing, decomposition, ARIMA, etc.).
Problem Statement: One of the leading travel companies would like to predict the number of air passengers travelling to Europe so that it can define its marketing strategy accordingly.
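A minimal sketch of the smoothing family of techniques mentioned above: simple exponential smoothing, whose last smoothed level serves as the one-step-ahead forecast. The data here is invented; the case study would use a real passenger series and richer models such as ARIMA:

```python
def exp_smooth(series, alpha=0.3):
    """Simple exponential smoothing: level = alpha*y + (1-alpha)*level.
    The final level is the one-step-ahead forecast."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

# Hypothetical monthly passenger counts (in thousands)
passengers = [112, 118, 132, 129, 121, 135]
forecast = exp_smooth(passengers, alpha=0.5)
```

Holt and Holt-Winters extend this recursion with trend and seasonal terms, which is how series with clear seasonality (like air traffic) are usually handled.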

Access to 72 hours of instructor-led live classes (24 classes of 3 hours each) and 12 hours of video-based classes (4 classes of 3 hours each), spread over 14 weekends

Video recordings of the class sessions for self-study purposes

Weekly assignments, reference code and study material in PDF format

Module-wise case studies/projects

Career guidance and career support after the completion of selected assignments and case studies

What if I miss a class?

Don’t worry. You will always receive a recording of the class in your inbox. All our live classes are recorded for self-study and future reference, and the recordings can also be accessed through our Learning Management System. So if you miss a class, you can go through the video recording and then reach out to the faculty during their doubt-clearing time, or ask your question at the beginning of the subsequent class.

You can also repeat any class you want in the next one year after your course completion.

For how long are the recordings available to me?

For 6 months post your course completion. If needed, you can also repeat any number of classes in the one year after course completion.

In principle the recordings are available to you for a lifetime, but for judicious use of IT resources, access to them is deactivated after 6 months; it can be extended upon request.

Can I download the recordings?

No. Recordings can be accessed through your account on the LMS, or streamed online at any point of time, but they cannot be downloaded.

Recordings are an integral part of AnalytixLabs' intellectual property. Downloading or distributing these recordings in any way is strictly prohibited and illegal, as they are protected under the Copyright Act. If a student is found doing so, it will lead to immediate and permanent suspension of services, access to all learning resources will be blocked, the course fee will be forfeited, and the institute will have the right to take strict legal action against the individual.

What if I share my LMS login details with a friend?

Sharing of LMS login credentials is unauthorized. As a security measure, if the LMS is accessed from multiple locations, it will be flagged in the system and your access to the LMS can be terminated.

Will I get a certificate in the end?

Yes. All our courses are certified. As part of the course, students get weekly assignments and module-wise case studies. The certificate is awarded on completion of the selected assignments and case studies (at least 70%).

Do you help in placements?

We follow a comprehensive and self-sustaining system to help our students with placements, which is a win-win for our candidates and corporate clients. As a prerequisite for learning validation, candidates are required to submit the case studies and project work provided as part of the course (flexible deadline). Support from our side is continuous and encompasses help with profile building and CV referrals, through our ex-students, HR consultants and companies directly reaching out to us.

We will guide you on the right profiles for you based on your education and experience, help with interview preparation, and conduct mock interviews if required. For us the placement process doesn’t end at a definite time after your course completion; it is a long relationship that we would like to build.

Do you guarantee placements?

No institute can genuinely guarantee placements; those that claim to are using a marketing gimmick. Our support is on a best-effort basis.

In a professional environment it is not feasible for any institute to guarantee placements. For us, placement support is on a best-effort basis and not time-bound: in some cases students reach out to us even 3 years later for career support.

Do you have a classroom option?

Yes, we have a classroom option for Delhi-NCR candidates. However, most of our students end up doing instructor-led live online classes, including those who initially join the classroom. Based on student feedback, the learning experience is the same in both the classroom and the fully interactive instructor-led live online mode.

How do I attend the online classes? Are they interactive or self-paced?

We provide both options. For instructor-led live online classes we use a platform also used by top universities across the globe. These sessions are fully interactive: students can chat, or even ask their questions verbally over VoIP in real time, to get their doubts cleared.

What do I need to attend the online classes?

To attend the online classes, all you need is a laptop or PC with a basic internet connection. Students have often shared good feedback about attending these live classes through a data card or even a mobile 3G connection, though we recommend a basic broadband connection.

For the best experience, a headset with a mic is recommended to enhance voice quality, though the laptop’s built-in mic works fine and you can also ask your questions over chat.

How can I reach out to someone if I have doubts post class?

Through the LMS, students can always connect with the trainer, or even schedule one-to-one time over the phone or online. During the course we also schedule periodic doubt-clearing classes, and students can ask doubts from one class in the subsequent class.

LMS also has a discussion forum where a lot of your doubts might get easily answered.

In case you still have a problem, repeat the class and schedule one-to-one time with the trainer.

What is your refund policy?
  • Instructor-led live online or classroom - within 7 days of the registration date and at the latest 3 days before the batch starts
  • Video-based - within 2 days
Can I pay in installments?

Yes. While making the fee payment, most of the courses offer an installment option.

I am having difficulty coping up with my classes. What can I do?

For all courses we provide recordings of each class for self-reference and revision, in case you missed any concept in class. If you still have doubts after going through the recordings, you can also take one-to-one time with the faculty outside class. Furthermore, students who want to break their course into different modules get one year to repeat any of the classes with other batches.

What are the system requirements for the software?

A 64-bit operating system with a minimum of 8GB RAM is recommended so that the virtual lab can be installed easily.

AnalytixLabs has a very good course structure which keeps getting updated with demand in the job market. I did the Data Science with SAS & R program and recommend this course to anyone who wishes to switch into analytics. Even if you miss any classes, you can go through the recordings in their LMS. Trainers are very helpful and will clear your doubts through follow-up classes.

- Sourabh Nanda (Business Analyst, Cognizant Analytics)
Have Questions?
Contact us and we shall
get back with answers.

Change the course of your career

2000+ students have already registered for our courses. Learn analytics from the experts.
Course Brochure
Upcoming Batches
Student Reviews