Advanced Big Data Science

Learn all that is necessary to be a Data Scientist - Python, Hadoop, Spark with Machine Learning

Did you know that Apache Spark has risen to become the most active open source project in big data? No wonder the McKinsey Global Institute estimates a shortage of 1.7 million Data Science and Big Data professionals over the next three years.


Considering this widening gap between demand and supply, this Advanced Data Science training helps IT/ITES professionals tap lucrative opportunities and boost their careers by gaining sought-after Big Data Analytics skills.


This is our advanced Big Data training, where attendees not only gain a practical, in-depth skill set on Hadoop but also learn advanced analytics concepts using Python, Hadoop and Spark. For extensive hands-on practice, candidates get access to the virtual lab along with several assignments and projects. At the end of the program, candidates are awarded the Advanced Data Science Certification on successful completion of the projects provided as part of the training.


A completely industry-relevant Big Data Analytics training and a great blend of analytics and technology, making it apt for aspirants who want to develop Big Data skills and get a head start in Big Data Analytics!


Course duration: 150 hours (at least 105 hours of live training plus practice and self-study, with ~8 hours of weekly self-study)

Who Should do this course?

Students and professionals from IT, Software or Data Warehousing backgrounds who want to get into the Big Data Analytics domain

Enroll in this course

Combo Deals!

Learn more, save more.
See our combo offers here.

Course Duration: 150 hours
Classes: 35
Tools: Python, Spark, Hadoop
Learning Mode: Live Training
Next Batch: 18th June, 2017 (Gurgaon)

Introduction to Data Science

  • What is Data Science?
  • Data Science vs. Analytics vs. Data Warehousing, OLAP, MIS Reporting
  • Relevance in Industry and need of the hour
  • Type of problems and objectives in various industries
  • How leading companies are harnessing the power of Data Science
  • Different phases of a typical Analytics / Data Science projects

Python: Introduction and Essentials

  • Overview of Python- Starting Python
  • Introduction to Python Editors & IDEs (Canopy, PyCharm, Jupyter, Rodeo, IPython, etc.)
  • Custom Environment Settings
  • Concept of Packages/Libraries - Important packages (NumPy, SciPy, scikit-learn, Pandas, Matplotlib, etc)
  • Installing & loading Packages & Namespaces
  • Data Types & Data objects/structures (Tuples, Lists, Dictionaries)
  • List and Dictionary Comprehensions
  • Variable & Value Labels – Date & Time Values
  • Basic Operations - Mathematical - string - date
  • Reading and writing data
  • Simple plotting
  • Control flow
  • Debugging
  • Code profiling
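
To give a flavour of the hands-on work in this module, here is a minimal, illustrative Python sketch covering data structures, comprehensions, basic operations and control flow (all values are made up for demonstration):

    # A small taste of the Python essentials covered above (illustrative only)
    from datetime import date

    prices = [250, 125, 300]                    # list
    product = ("SKU-1", "Widget", 250)          # tuple
    stock = {"SKU-1": 40, "SKU-2": 15}          # dictionary

    # List and dictionary comprehensions
    discounted = [p * 0.9 for p in prices]
    low_stock = {sku: qty for sku, qty in stock.items() if qty < 20}
    print("Discounted prices:", discounted)

    # Basic date operation with simple control flow
    today = date.today()
    for sku, qty in low_stock.items():
        print(f"{today}: reorder {sku} (only {qty} left)")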

Python: Accessing/Importing and Exporting Data

  • Importing Data from various sources (CSV, TXT, Excel, Access, etc.)
  • Database Input (Connecting to database)
  • Viewing Data objects - subsetting, methods
  • Exporting Data to various formats
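
As a small illustration of the import/export workflow with pandas (customers.csv, sales.db, the orders table and the annual_spend column are placeholder names assumed purely for this example):

    import sqlite3
    import pandas as pd

    # Import data from a CSV file (placeholder file name)
    df = pd.read_csv("customers.csv")
    print(df.head())                                  # view the data object

    # Subsetting with a condition (annual_spend is an assumed column)
    high_value = df[df["annual_spend"] > 50000]

    # Database input: read a table through a connection (SQLite shown here)
    conn = sqlite3.connect("sales.db")
    orders = pd.read_sql("SELECT * FROM orders", conn)

    # Export to various formats
    high_value.to_excel("high_value_customers.xlsx", index=False)  # needs openpyxl
    df.to_csv("customers_copy.csv", index=False)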

Python: Data Manipulation - Cleansing

  • Cleansing Data with Python
  • Data Manipulation steps (Sorting, filtering, duplicates, merging, appending, subsetting, derived variables, sampling, data type conversions, renaming, formatting, etc.)
  • Data manipulation tools (Operators, Functions, Packages, control structures, Loops, arrays etc)
  • Python Built-in Functions (Text, numeric, date, utility functions)
  • Python User Defined Functions
  • Stripping out extraneous information
  • Normalizing data
  • Formatting data
  • Important Python Packages for data manipulation (pandas, NumPy, etc.)
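
A minimal pandas sketch of the cleansing and manipulation steps listed above, run on a small made-up dataset:

    import pandas as pd

    customers = pd.DataFrame({
        "cust_id": [1, 2, 2, 3],
        "name": [" Asha ", "Ravi", "Ravi", "Meena"],
        "city": ["Delhi", "Mumbai", "Mumbai", None],
    })
    orders = pd.DataFrame({"cust_id": [1, 2, 3], "amount": [1200, 800, 1500]})

    clean = (customers
             .drop_duplicates(subset="cust_id")              # remove duplicates
             .assign(name=lambda d: d["name"].str.strip())   # strip extra spaces
             .fillna({"city": "Unknown"}))                   # handle missing values

    merged = clean.merge(orders, on="cust_id", how="left")   # merging datasets
    merged["amount_k"] = merged["amount"] / 1000             # derived variable
    print(merged.sort_values("amount", ascending=False))     # sorting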

Python: Data Analysis - Visualization

  • Introduction to exploratory data analysis
  • Descriptive statistics, Frequency Tables and summarization
  • Univariate Analysis (Distribution of data & Graphical Analysis)
  • Bivariate Analysis (Cross Tabs, Distributions & Relationships, Graphical Analysis)
  • Creating Graphs (Bar/pie/line chart/histogram/boxplot/scatter/density, etc.)
  • Important Packages for Exploratory Analysis (NumPy Arrays, Matplotlib, Pandas and scipy.stats etc)
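
An illustrative exploratory-analysis snippet, using randomly generated data purely for demonstration:

    import numpy as np
    import pandas as pd
    import matplotlib.pyplot as plt

    np.random.seed(0)
    df = pd.DataFrame({
        "age": np.random.randint(20, 60, 200),
        "income": np.random.normal(50000, 12000, 200),
        "segment": np.random.choice(["A", "B"], 200),
    })

    print(df.describe())                               # descriptive statistics
    print(df["segment"].value_counts())                # frequency table
    print(pd.crosstab(df["segment"], df["age"] > 40))  # simple cross tab

    df["income"].plot(kind="hist", bins=20, title="Income distribution")
    plt.show()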

Python: Basic statistics

  • Basic Statistics - Measures of Central Tendencies and Variance
  • Building blocks - Probability Distributions - Normal distribution - Central Limit Theorem
  • Inferential Statistics -Sampling - Concept of Hypothesis Testing
  • Statistical Methods - Z/t-tests (One sample, independent, paired), ANOVA, Correlations and Chi-square
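
A short scipy.stats sketch of the kind of tests covered in this module, run on simulated data:

    import numpy as np
    from scipy import stats

    np.random.seed(1)
    group_a = np.random.normal(100, 10, 50)   # e.g. monthly spend of group A
    group_b = np.random.normal(105, 10, 50)   # e.g. monthly spend of group B

    # Independent two-sample t-test
    t_stat, p_val = stats.ttest_ind(group_a, group_b)
    print(f"t = {t_stat:.2f}, p = {p_val:.4f}")

    # Correlation and a chi-square test on a small contingency table
    r, p_r = stats.pearsonr(group_a, group_a * 0.5 + np.random.normal(0, 5, 50))
    chi2, p_chi, dof, expected = stats.chi2_contingency([[30, 20], [25, 25]])
    print(f"Pearson r = {r:.2f}, chi-square p = {p_chi:.4f}")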

Python: Polyglot programming

  • Making Python talk to other languages and database systems
  • How R and Python play with each other, and why it's essential to know both

Python: Machine Learning - Predictive Modeling - Basics

  • Introduction to Machine Learning & Predictive Modeling
  • Types of Business problems - Mapping of Techniques
  • Major Classes of Learning Algorithms -Supervised vs Unsupervised Learning
  • Different Phases of Predictive Modeling (Data Pre-processing, Sampling, Model Building, Validation)
  • Overfitting (Bias-Variance Trade off) & Performance Metrics
  • Types of validation (Bootstrapping, K-Fold cross-validation, etc.)
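
For illustration, a minimal scikit-learn sketch of hold-out and K-fold validation; the bundled breast-cancer dataset stands in for real project data:

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split, cross_val_score

    X, y = load_breast_cancer(return_X_y=True)

    # Hold-out validation: simple train/test split
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42)
    model = LogisticRegression(max_iter=5000).fit(X_train, y_train)
    print("Hold-out accuracy:", model.score(X_test, y_test))

    # K-fold cross-validation for a more stable performance estimate
    scores = cross_val_score(LogisticRegression(max_iter=5000), X, y, cv=5)
    print("5-fold CV accuracy:", scores.mean())

Cross-validation averages the score over several train/test splits, which gives a less noisy estimate of model performance than a single hold-out split.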

Python: Machine Learning in Practice

  • Linear Regression
  • Logistic Regression
  • Segmentation - Cluster Analysis (K-Means)
  • Decision Trees (CHAID/CART/C5.0)
  • Artificial Neural Networks (ANN)
  • Support Vector Machines (SVM)
  • Ensemble Learning (Random Forest, Bagging & boosting)
  • Other Techniques (KNN, Naïve Bayes, LDA/QDA etc)
  • Important Packages for Machine Learning (scikit-learn, scipy.stats, etc.)
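
A compact example of the supervised and unsupervised techniques above using scikit-learn (the iris dataset is only a placeholder for real project data):

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.cluster import KMeans
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # Supervised learning: an ensemble (Random Forest) classifier
    rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
    print("Random Forest accuracy:", accuracy_score(y_test, rf.predict(X_test)))

    # Unsupervised learning: K-Means segmentation of the same data
    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
    print("Cluster sizes:", [int((km.labels_ == k).sum()) for k in range(3)])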

Introduction to Big Data

  • Introduction and Relevance
  • Uses of Big Data analytics in various industries like Telecom, E-commerce, Finance and Insurance, etc.
  • Problems with Traditional Large-Scale Systems

Hadoop(Big Data) Eco-System

  • Motivation for Hadoop
  • Different types of projects by Apache
  • Role of projects in the Hadoop Ecosystem
  • Key technology foundations required for Big Data
  • Limitations and Solutions of existing Data Analytics Architecture
  • Comparison of traditional data management systems with Big Data management systems
  • Evaluate key framework requirements for Big Data analytics
  • Hadoop Ecosystem & Hadoop 2.x core components
  • Explain the relevance of real-time data
  • Explain how to use Big Data and real-time data as a Business planning tool

Hadoop cluster-Architecture-Configuration files

  • Hadoop Master-Slave Architecture
  • The Hadoop Distributed File System - Concept of data storage
  • Explain different types of cluster setups (fully distributed, pseudo-distributed, etc.)
  • Hadoop cluster set up - Installation
  • Hadoop 2.x Cluster Architecture
  • A Typical enterprise cluster – Hadoop Cluster Modes
  • Understanding cluster management tools like Cloudera Manager/Apache Ambari

Hadoop-HDFS & MapReduce (YARN)

  • HDFS Overview & Data storage in HDFS
  • Getting data into Hadoop from the local machine (data loading techniques) and vice versa
  • Map Reduce Overview (Traditional way Vs. MapReduce way)
  • Concept of Mapper & Reducer
  • Understanding MapReduce program Framework
  • Develop MapReduce Program using Java (Basic)
  • Develop MapReduce programs with the Streaming API (Basic)
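
As an illustration of the Streaming API item above, here is a minimal Hadoop Streaming word count written as two small Python scripts; mapper.py and reducer.py are assumed file names, submitted together with the hadoop-streaming jar:

    # ---- mapper.py: reads lines from stdin and emits "word<TAB>1" ----
    import sys

    for line in sys.stdin:
        for word in line.strip().split():
            print(f"{word}\t1")

    # ---- reducer.py: input arrives sorted by key, so counts accumulate per word ----
    import sys

    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t")
        if word != current_word:
            if current_word is not None:
                print(f"{current_word}\t{count}")
            current_word, count = word, 0
        count += int(value)
    if current_word is not None:
        print(f"{current_word}\t{count}")

The Hadoop framework performs the shuffle and sort between the two scripts, so the reducer always sees its input grouped by word.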

Data Integration using Sqoop & Flume

  • Integrating Hadoop into an Existing Enterprise
  • Loading Data from an RDBMS into HDFS by Using Sqoop
  • Managing Real-Time Data Using Flume
  • Accessing HDFS from Legacy Systems

Data Analysis using Pig

  • Introduction to Data Analysis Tools
  • Apache PIG - MapReduce Vs Pig, Pig Use Cases
  • PIG’s Data Model
  • PIG Streaming
  • Pig Latin Program & Execution
  • Pig Latin: Relational Operators, File Loaders, Group Operator, COGROUP Operator, Joins and COGROUP, Union, Diagnostic Operators, Pig UDFs
  • Writing Java UDFs
  • Embedding Pig in Java
  • PIG Macros
  • Parameter Substitution
  • Use Pig to automate the design and implementation of MapReduce applications
  • Use Pig to apply structure to unstructured Big Data

Data Analysis using Hive

  • Apache Hive - Hive Vs. PIG - Hive Use Cases
  • Discuss the Hive data storage principle
  • Explain the File formats and Records formats supported by the Hive environment
  • Perform operations with data in Hive
  • Hive QL: Joining Tables, Dynamic Partitioning, Custom Map/Reduce Scripts
  • Hive Script, Hive UDF
  • Hive Persistence formats
  • Loading data in Hive - Methods
  • Serialization & Deserialization
  • Handling Text data using Hive
  • Integrating external BI tools with Hadoop Hive

Data Analysis using Impala

  • Impala & Architecture
  • How Impala executes Queries and its importance
  • Hive vs. PIG vs. Impala
  • Extending Impala with User Defined functions

Introduction to other Ecosystem tools

  • NoSQL databases - HBase
  • Introduction to Oozie

Spark: Introduction

  • Introduction to Apache Spark
  • Streaming Data Vs. In Memory Data
  • Map Reduce Vs. Spark
  • Modes of Spark
  • Spark Installation Demo
  • Overview of Spark on a cluster
  • Spark Standalone Cluster

Spark: Spark in Practice

  • Invoking Spark Shell
  • Creating the Spark Context
  • Loading a File in Shell
  • Performing Some Basic Operations on Files in Spark Shell
  • Caching Overview
  • Distributed Persistence
  • Spark Streaming Overview (Example: Streaming Word Count)
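
A minimal PySpark sketch of the shell-style operations above, assuming PySpark is installed locally; input.txt is a placeholder path:

    from pyspark import SparkContext

    # Create the Spark context (local mode for practice)
    sc = SparkContext("local[*]", "basic-ops-demo")

    # Load a text file and cache it for repeated use
    lines = sc.textFile("input.txt")
    lines.cache()

    # Basic transformations and actions: word count
    words = lines.flatMap(lambda line: line.split())
    counts = words.map(lambda w: (w, 1)).reduceByKey(lambda a, b: a + b)

    print("Total lines:", lines.count())
    print("Top words:", counts.takeOrdered(5, key=lambda kv: -kv[1]))

    sc.stop()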

Spark: Spark meets Hive

  • Analyze Hive and Spark SQL Architecture
  • Analyze Spark SQL
  • Context in Spark SQL
  • Implement a sample example for Spark SQL
  • Integrating hive and Spark SQL
  • Support for JSON and Parquet File Formats
  • Implement Data Visualization in Spark
  • Loading of Data
  • Hive Queries through Spark
  • Performance Tuning Tips in Spark
  • Shared Variables: Broadcast Variables & Accumulators
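
For illustration, a small PySpark SQL sketch touching Hive queries, JSON/Parquet support and shared variables; the transactions table and the file paths are assumed purely for this example, and running Hive queries needs a Hive-enabled Spark build:

    from pyspark.sql import SparkSession

    # SparkSession with Hive support
    spark = (SparkSession.builder
             .appName("spark-sql-hive-demo")
             .enableHiveSupport()
             .getOrCreate())

    # Run a Hive query through Spark SQL (table name is illustrative)
    totals = spark.sql(
        "SELECT customer_id, SUM(amount) AS total FROM transactions GROUP BY customer_id")
    totals.show(5)

    # Read and write JSON and Parquet (placeholder paths)
    events = spark.read.json("events.json")
    events.write.mode("overwrite").parquet("events_parquet")

    # Shared variables: a broadcast lookup table and an accumulator
    lookup = spark.sparkContext.broadcast({"IN": "India", "US": "United States"})
    bad_records = spark.sparkContext.accumulator(0)

    spark.stop()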

Spark Streaming

  • Extract and analyze data from Twitter using Spark Streaming
  • Comparison of Spark and Storm – Overview
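
A minimal Spark Streaming word count in Python; a local socket source (fed, for example, by "nc -lk 9999") is used here instead of Twitter, which needs API credentials, but the processing pattern is the same:

    from pyspark import SparkContext
    from pyspark.streaming import StreamingContext

    sc = SparkContext("local[2]", "streaming-word-count")
    ssc = StreamingContext(sc, batchDuration=5)     # 5-second micro-batches

    # Listen on a local socket instead of Twitter (placeholder source)
    lines = ssc.socketTextStream("localhost", 9999)
    counts = (lines.flatMap(lambda l: l.split())
                   .map(lambda w: (w, 1))
                   .reduceByKey(lambda a, b: a + b))
    counts.pprint()

    ssc.start()
    ssc.awaitTermination()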

Spark GraphX

  • Overview of the GraphX module in Spark
  • Creating graphs with GraphX

Introduction to Machine Learning using Spark

  • Understand Machine learning framework
  • Implement some of the ML algorithms using Spark MLlib
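
An illustrative Spark MLlib (spark.ml) sketch that fits a logistic regression on a tiny made-up dataset:

    from pyspark.sql import SparkSession
    from pyspark.ml.feature import VectorAssembler
    from pyspark.ml.classification import LogisticRegression

    spark = SparkSession.builder.appName("mllib-demo").getOrCreate()

    # Toy data: (age, income, label) where label 1 = defaulted
    data = spark.createDataFrame(
        [(25, 30000.0, 0), (45, 20000.0, 1), (35, 60000.0, 0), (50, 25000.0, 1)],
        ["age", "income", "label"])

    # Assemble feature columns into a vector and fit the model
    assembler = VectorAssembler(inputCols=["age", "income"], outputCol="features")
    train = assembler.transform(data)
    model = LogisticRegression(maxIter=20).fit(train)
    model.transform(train).select("label", "prediction").show()

    spark.stop()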

Project

  • Consolidate all the learnings
  • Working on Big Data Project by integrating various key components

Key Drivers for Customer Spending

Objective: Provide end-to-end steps to build and validate a regression model that identifies the key drivers of customer spend using Python-Spark.
Problem Statement: One of the leading banks would like to identify the key drivers of customer spending so that it can define a strategy to optimize its product features.

Predicting bad customers (defaulters) using credit customer application data

Objective: Provide end-to-end steps to build and validate a classification model using Python-Spark.
Problem Statement: One of the leading banks would like to predict bad customers (defaulters) based on the customer data provided in their applications.

Telecom Customer Segmentation

Objective: Apply advanced algorithms such as factor analysis and cluster analysis for data reduction and customer segmentation based on customer behavioural data.
Problem Statement: Build an enriched customer segmentation for one of the leading telecom companies and profile the segments using different KPIs to define its marketing strategy.

Air Passengers Forecasting

Objective: Gain hands-on experience in applying different time series forecasting techniques (averages/smoothing, decomposition, ARIMA, etc.).
Problem Statement: One of the leading travel companies would like to predict the number of air passengers travelling to Europe so that it can define its marketing strategy accordingly.

Access to 105 hours of instructor-led live classes (35 classes of 3 hours each), spread over 18 weekends

Video recordings of the class sessions for self-study purposes

Weekly assignments, reference code and study material in PDF format

Module-wise case studies/projects

Career guidance and career support after the completion of selected assignments and case studies

What if I miss a class?

Don’t worry. You will always receive a recording of the class in your inbox. Have a look at it and reach out to the faculty in case of doubts. All our live classes are recorded for self-study and future reference, and they can also be accessed through our Learning Management System. Hence, if you miss a class, you can refer to the video recording and then reach out to the faculty during their doubt-clearing time or ask your question at the beginning of the subsequent class.

You can also repeat any class you want in the next one year after your course completion.

For how long are the recordings available to me?

One year post your course completion. If needed, you can also repeat any number of classes you want in the next one year after course completion.

Virtually, the recordings are available to you for a lifetime, but for judicious use of IT resources, access to these recordings gets deactivated after one year, which can be extended upon request.

Can I download the recordings?

No. However, the recordings can be accessed through your account on the LMS or streamed online at any point of time.

Recordings are an integral part of AnalytixLabs' intellectual property, suo jure. Downloading or distributing these recordings in any way is strictly prohibited and illegal, as they are protected under copyright law. In case a student is found doing so, it will lead to immediate and permanent suspension of services, access to all learning resources will be blocked, the course fee will be forfeited and the institute will have every right to take strict legal action against the individual.

What if I share my LMS login details with a friend?

Sharing of LMS login credentials is unauthorized. As a security measure, if the LMS is accessed from multiple locations, it will be flagged in the system and your access to the LMS can be terminated.

Will I get a certificate in the end?

Yes. All our courses are certified. As part of the course, students get weekly assignments and module-wise case studies. Once all your submissions are received and evaluated, the certificate is awarded.

Do you help in placements?

We follow a comprehensive and self-sustaining system to help our students with placements. This is a win-win situation for our candidates and corporate clients. As a prerequisite for learning validation, candidates are required to submit the case studies and project work provided as part of the course (flexible deadline). Support from our side is continuous and encompasses help in profile building and CV referrals through our ex-students, HR consultants and companies directly reaching out to us.

We will guide you on the right profiles for you based on your education and experience, and help with interview preparation, including mock interviews if required. The placement process for us doesn't end at a definite time after your course completion; it is a long-term relationship that we would like to build.

Do you guarantee placements?

No institute can guarantee placements, unless they are doing so as a marketing gimmick!

In a professional environment, it is not feasible for any institute to do so. For us, placement support is on a best-effort basis and is not time-bound; in some cases, students reach out to us even after three years for career support.

Do you have a classroom option?

Yes, we have a classroom option for Delhi-NCR candidates. However, most of our students end up taking instructor-led live online classes, including those who join the classroom mode in the beginning. Based on student feedback, the learning experience is the same in both the classroom and the fully interactive, instructor-led live online mode.

How do I attend the online classes? Are they interactive or self-paced?

We provide both options. For instructor-led live online classes, we use a gold-standard platform used by top universities across the globe. These video sessions are fully interactive: students can chat or even ask their questions verbally over VoIP in real time to get their doubts cleared.

What do I need to attend the online classes?

To attend the online classes, all you need is a laptop/PC with a basic internet connection. Students have often shared good feedback about attending these live classes through a data card or even a mobile 3G connection, though we recommend a basic broadband connection.

For the best user experience, a headset with a mic is recommended to enhance voice quality, though the laptop's built-in mic works fine and you can also ask your questions over chat.

How can I reach out to someone if I have doubts post class?

Through the LMS, students can always connect with the trainer or even schedule one-to-one time over the phone or online. During the course we also schedule periodic doubt-clearing sessions, and students can ask questions about a class at the beginning of the subsequent class.

LMS also has a discussion forum where a lot of your doubts might get easily answered.

In case you still have a problem, repeat the class and schedule one-to-one time with the trainer.

What is your refund policy?

  • Instructor-led live online or classroom - within 7 days of the registration date and no later than 3 days before the batch starts
  • Video-based - 2 days

Can I pay in installments?

Yes. Most of the courses offer an installment option while making the fee payment.

I am having difficulty coping with my classes. What can I do?

For all courses, we provide recordings of each class for self-reference and revision in case you miss any concept in class. If you still have doubts after going through the recordings, you can also take one-to-one time with the faculty outside class hours. Furthermore, if students want to break their course into different modules, they get one year to repeat any of the classes with other batches.

What are the system requirements for the software?

It is recommended to have a 64-bit operating system with a minimum of 8 GB RAM so that the virtual lab can be installed easily.

Analytics is a growing field and this certification will definitely help you make it big in this field. The SAS Business Analytics program at AnalytixLabs has helped me learn the business intelligence software and techniques which you don't get to learn in your undergrad or anywhere else.

- Sahil Arora (Analyst, Amazon)

Have questions? Contact us and we shall get back with answers.

Change the course of your career

Over 6,000 learners, and hundreds making the right choice every month!