
Big Data Architect Training

Overview

The Big Data Architect program is a 360° training program offered by Cognixia to professionals seeking to deepen their knowledge in the field of Big Data. The curriculum is aligned with current industry standards and organized into major sub-modules. Designed by industry experts, the program provides hands-on training with industry-standard tools.

The program includes full-fledged training courses on Java, Hadoop, MongoDB, Scala Programming, and Spark and Scala Development, all of which are essential skills for Big Data Architects. Together, these modules provide a solid foundation and a more competitive edge.

This course specifically targets passionate professionals who have already gained expertise in basic Java development and want to advance to the next level in Big Data architecture.


What You'll Learn

Cognixia’s Big Data Architect Program will enable candidates to:

  • Get exposure to real-life projects that will help them create high-quality Java programs by developing and implementing the Java Framework.
  • Conduct full-fledged Hadoop development and implementation with excellence.
  • Load information from disparate data sets and translate complex functional and technical requirements into detailed designs.
  • Implement Hive & Pig, HBase, MapReduce Integration, and Advanced Indexing.
  • Learn essential NoSQL concepts and get acquainted with the query language, indexes, and MongoDB’s scalability and high-quality features.
  • Gain in-depth knowledge of Big Data processing in Hadoop and Spark environments through the Spark and Scala module.

Overview of the Modules

Candidates will receive a rigorous 126 hours of training covering the following five major modules, with significant case studies at the end of the program.

  • Java Essentials

Under this module, candidates will learn about OOP concepts, core and advanced Java, Servlets, and JSP technology.
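As a flavor of the OOP concepts this module covers, the sketch below shows inheritance, method overriding, and runtime polymorphism in Java. The `Shape`, `Circle`, and `Rectangle` names are illustrative, not taken from the course material.

```java
// A minimal sketch of core OOP ideas from the Java Essentials module:
// inheritance, method overriding, and runtime polymorphism.
abstract class Shape {
    abstract double area();            // overridden by each subclass
}

class Circle extends Shape {
    private final double radius;
    Circle(double radius) { this.radius = radius; }
    @Override double area() { return Math.PI * radius * radius; }
}

class Rectangle extends Shape {
    private final double width, height;
    Rectangle(double width, double height) { this.width = width; this.height = height; }
    @Override double area() { return width * height; }
}

public class PolymorphismDemo {
    public static void main(String[] args) {
        // The same reference type dispatches to different implementations.
        Shape[] shapes = { new Circle(1.0), new Rectangle(2.0, 3.0) };
        double total = 0;
        for (Shape s : shapes) total += s.area();
        System.out.println(total);
    }
}
```

The loop never needs to know which concrete shape it is iterating over; each call to `area()` is resolved at runtime.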

  • Big Data Hadoop Developer Program

This program is designed to familiarize candidates with Linux and Big Data virtual machines (VMs). Candidates will be taught about the Hadoop Distributed File System (HDFS): its interface, its features, and its role in fault tolerance. Other topics covered in this module include an overview of MapReduce (theoretical and practical), Hadoop Streaming (developing and debugging non-Java MR programs in Ruby and Python), Bulk Synchronous Parallel (BSP) as an alternative to MapReduce, and higher-level abstractions for MapReduce (Pig and Hive). In the case studies, candidates will use Pig, HBase, Hive, and MapReduce to perform the Big Data analytics learned in the course. Two case studies, one on Twitter analysis and the other on click-stream analysis, will give a complete understanding of interesting data-analysis facts and concepts.
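To preview the MapReduce pattern before the module itself, here is a driver-free word-count sketch using plain Java collections rather than the Hadoop API: the map phase emits words from each line, and the reduce phase sums the count per word, mirroring the classic Hadoop word-count example.

```java
import java.util.*;
import java.util.stream.*;

// A conceptual stand-in for Hadoop's word-count MapReduce job,
// written with ordinary Java streams instead of the Hadoop API.
public class WordCountSketch {
    static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                // map phase: split each input line into words
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(w -> !w.isEmpty())
                // shuffle + reduce phase: group identical words and sum occurrences
                .collect(Collectors.groupingBy(w -> w, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<String> lines = List.of("big data big ideas", "data pipelines");
        System.out.println(wordCount(lines));
    }
}
```

In real Hadoop, the map and reduce phases run on separate cluster nodes and the framework handles the shuffle; the data flow, however, is the same.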

  • MongoDB

In this module, candidates will gain an understanding of MongoDB, including its installation, advantages, syntax, and queries, and of how NoSQL suits Big Data needs. The course also covers CRUD concepts, MongoDB security, and MongoDB administration activities. A hands-on MongoDB project will show how to work with the MongoDB Java Driver and how to use MongoDB as a Java developer.
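The CRUD vocabulary the module uses can be illustrated without a running database. The sketch below is an in-memory stand-in: a `HashMap` keyed by `_id` plays the role of a MongoDB collection, and the four methods mirror create, read, update, and delete. The course material itself works against the real MongoDB Java Driver instead.

```java
import java.util.*;

// An in-memory stand-in for CRUD operations on a document store.
// A HashMap keyed by _id plays the role of a MongoDB collection here.
public class CrudSketch {
    private final Map<String, Map<String, Object>> collection = new HashMap<>();

    // Create: insert a document under its _id
    public void insertOne(String id, Map<String, Object> doc) {
        collection.put(id, new HashMap<>(doc));
    }

    // Read: find a document by _id (null if absent)
    public Map<String, Object> findById(String id) {
        return collection.get(id);
    }

    // Update: set a single field on an existing document
    public void updateOne(String id, String field, Object value) {
        Map<String, Object> doc = collection.get(id);
        if (doc != null) doc.put(field, value);
    }

    // Delete: remove the document entirely
    public void deleteOne(String id) {
        collection.remove(id);
    }
}
```

Once the semantics are clear, swapping the map for a driver-backed collection changes the plumbing but not the four verbs.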

  • Scala Programming

Like Java, Scala is an object-oriented programming language, and this module has been designed to impart in-depth knowledge of programming in Scala. The module follows a 60:40 ratio: 60% practical sessions and 40% theoretical classes. Candidates will learn about functional programming principles, exception handling, and XML manipulation in Scala.

  • Apache Spark and Scala Development Program

The objective of this program is to deliver a clear understanding of Apache Spark & Scala concepts. It will provide an overview of the Hadoop ecosystem, Hive, GraphX, and Spark Machine Learning libraries (Spark MLlib). Candidates will also learn Spark RDD, and how to write and deploy Spark applications.

Each module is followed by practical assignments to be completed before the next class begins, ensuring candidates have properly learned the material and cleared all their doubts before moving ahead.

Who can take this training?

This course is designed for tech-savvy individuals seeking in-depth knowledge in the field of Big Data. It offers promising benefits to freshers, experienced developers and architects, corporate IT professionals, engineers, and other professionals.

Duration: 126 Hours

Curriculum

Core Java

  • Features of Java
  • Java Basics
  • Classes and Objects
  • Garbage Collection
  • Java Arrays
  • Referring Java Documentation
  • Wrapper classes
  • Inheritance
  • Polymorphism
  • Abstract Classes
  • Interfaces
  • Packages
  • Introduction to Exception Handling
  • Checked/Unchecked Exceptions
  • Using try, catch, finally, throw, throws
  • Exception Propagation
  • Pre-defined Exceptions
  • User Defined Exceptions
  • Overview of Java IO Package
  • Byte Streams
  • Character Streams
  • Object Serialization & Object Externalization
  • Introduction to GUI Programming (Swing)
  • Introduction to Multithreading
  • Thread Lifecycle
  • Thread Priorities
  • Using wait() & notify()
  • DeadLocks
  • JDBC Architecture
  • Using JDBC API
  • Transaction Management
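
Several of the exception-handling topics above can be tied together in one short sketch: a user-defined checked exception, `throw`/`throws`, and `try`/`catch`/`finally`. The `InsufficientFundsException` name is illustrative, not from the course.

```java
// A user-defined checked exception plus try/catch/finally,
// covering several items from the Core Java topic list above.
class InsufficientFundsException extends Exception {
    InsufficientFundsException(String message) { super(message); }
}

public class ExceptionDemo {
    static double withdraw(double balance, double amount)
            throws InsufficientFundsException {
        if (amount > balance) {
            throw new InsufficientFundsException("balance too low");
        }
        return balance - amount;
    }

    public static void main(String[] args) {
        try {
            withdraw(100.0, 250.0);
        } catch (InsufficientFundsException e) {
            // checked exceptions must be caught or declared by the caller
            System.out.println("Caught: " + e.getMessage());
        } finally {
            // runs whether or not the exception was thrown
            System.out.println("done");
        }
    }
}
```

Because the exception extends `Exception` rather than `RuntimeException`, the compiler forces every caller of `withdraw` to handle or declare it, which is the distinction between checked and unchecked exceptions in the list above.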

Java Servlet Technology

  • What is a Servlet?
  • Servlet Life Cycle
  • Initializing a Servlet
  • Writing Service Methods
  • Getting Information from Requests
  • Constructing Responses
  • ServletContext and ServletConfig Parameters
  • Attributes – Context, Request and Session
  • Maintaining Client State – Cookies/URL rewriting/Hidden Form Fields
  • Session Management
  • Servlet Communication – include, forward, redirect
  • WEB-INF and the Deployment Descriptor

Java Server Pages Technology

  • What Is a JSP Page?
  • The Lifecycle of a JSP Page
  • Execution of a JSP Page
  • Different Types of Tags (directive, standard actions, bean tags, expressions, declarative)
  • Creating Static Content
  • Creating Dynamic Content
  • Using Implicit Objects within JSP Pages
  • JSP Scripting Elements
  • Including Content in a JSP Page
  • Transferring Control to Another Web Component – communication with servlet
  • Param Element
  • JavaBeans Component Design Conventions
  • Why Use a JavaBeans Component?
  • Creating and Using a JavaBeans Component
  • Setting JavaBeans Component Properties
  • Retrieving JavaBeans Component Properties
  • Custom Tags
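
The JavaBeans design conventions listed above boil down to a short contract: a public no-argument constructor plus matching getter/setter pairs, so tags such as `jsp:useBean` and `jsp:setProperty` can create and populate the bean by reflection. A minimal illustrative bean (the `Contact` class is an example, not course code):

```java
// A minimal JavaBean following the design conventions covered above:
// public no-arg constructor and getter/setter pairs for each property,
// so the bean can be driven from JSP tags via reflection.
public class Contact implements java.io.Serializable {
    private String name;
    private String email;

    public Contact() { }               // no-arg constructor required

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public String getEmail() { return email; }
    public void setEmail(String email) { this.email = email; }
}
```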

Introduction to Scala

  • A brief history of the Java platform to date
  • Distinguishing between the Java language and platform
  • Pain points when using Java for software development
  • Possible criteria for an improved version of Java
  • How and why the Scala language was created

Key Features of the Scala Language

    • Everything is an object
    • Class declarations
    • Data typing
    • Operators and methods
    • Pattern matching
    • Functions
    • Anonymous and nested functions
    • Traits

Basic Programming in Scala

    • Built-in types, literals, and operators
    • Testing for equality of state and reference
    • Conditionals, simple matching and external iteration
    • Working with lists, arrays, sets and maps
    • Throwing and catching exceptions
    • Adding annotations to your code
    • Using standard Java libraries
    • Using Scala within a Java application and vice versa

OO Development in Scala

    • A minimal class declaration
    • Understanding primary constructors
    • Specifying alternative constructors
    • Declaring and overriding methods
    • Creating base classes and class hierarchies
    • Creating traits and mixing them into classes
    • How a Scala inheritance tree is linearized

Functional Programming in Scala

    • Advanced uses of for expressions
    • Understanding function values and closures
    • Using closures to create internal iterators
    • Creating and using higher order functions
    • Practical examples of higher order functions
    • Currying and partially applied functions
    • Creating your own Domain-Specific Languages (DSLs)
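
The module teaches these functional-programming topics in Scala; for readers coming from the earlier Java modules, the same ideas of higher-order functions, currying, and partial application can be sketched with Java lambdas from `java.util.function`:

```java
import java.util.function.Function;

// Currying and partial application expressed with Java lambdas,
// as an analogue of the Scala techniques taught in this module.
public class CurryingDemo {
    // A curried add: a function that returns a function
    static Function<Integer, Function<Integer, Integer>> add = a -> b -> a + b;

    public static void main(String[] args) {
        // Partially apply the first argument to obtain a new function
        Function<Integer, Integer> addTen = add.apply(10);
        System.out.println(addTen.apply(5));   // 15
    }
}
```

Scala makes this lighter with multiple parameter lists (`def add(a: Int)(b: Int) = a + b`), but the underlying idea, functions as first-class values that can be built from other functions, is identical.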

Exception Handling in Scala

  • Try-catch with case

Pattern Matching in Depth

    • Using the match keyword to return a value
    • Using case classes for pattern matching
    • Adding pattern guards to match conditions
    • Partially specifying matches with wildcards
    • Deep matching using case constructors
    • Matching against collections of items
    • Using extractors instead of case classes

Test Driven Development in Scala

  • Writing standard JUnit tests in Scala
  • Conventional TDD using the ScalaTest tool
  • Behavior Driven Development using ScalaTest
  • Using functional concepts in TDD

XML Manipulation in Scala

  • Using Scala to read and write XML using different parsers (DOM, SAX)
  • Working with XML literals in code
  • Embedding XPath-like expressions
  • Using pattern matching to process XML data
  • Serializing and de-serializing to and from XML
  • Scala with database transactions

Writing Concurrent Apps

  • Issues with conventional approaches to multi-threading
  • How an actor-based approach helps you write thread-safe code
  • The Scala architecture for creating actor-based systems
  • Different coding styles supported by the actor model

Other Scala Topics

  • Scala with JAXB
  • Scala to call/consume a REST/SOAP service
  • Scala with logging information
  • Using Scala in web applications (JSP, Servlets)
  • Conclusion

Introduction

  • Introduction
  • Module Outline
  • What We Will Build
  • History of Play!
  • Philosophy
  • Technologies
  • Summary

Starting Up

  • Introduction
  • Downloading Play!
  • The Play Command
  • Compiling and Hot Deploy
  • Testing
  • IDEs
  • Project Structure
  • Configuration
  • Error Handling
  • Summary

Routing

  • Introduction
  • The Router
  • Router Mechanics
  • Routing Rules
  • Play! Routes
  • Play! Routes: HTTP Verbs
  • Play! Routes: The Path
  • Play! Routes: The Action Call
  • Routing in Action
  • Summary

Controllers, Actions, and Results

  • Introduction
  • Controllers
  • Actions
  • Results
  • Session and Flash Scope
  • Request Object
  • Implementing the Contacts Stub Controller
  • Summary

Views

  • Introduction
  • Play! Views
  • Static Views
  • Passing Arguments
  • Iteration
  • Conditionals
  • Partials and Layouts
  • Accessing the Session Object
  • The Asset Route
  • Summary

Data Access

  • Introduction
  • Agnostic Data Access
  • The Domain Model
  • Evolutions
  • Finder and Listing Contacts
  • The Form Object and Adding a Contact
  • Editing a Contact
  • Deleting a Contact
  • Review
  • Summary

The Global Object

  • Introduction
  • The Global Object
  • Global Object Methods
  • onStart
  • onHandlerNotFound
  • Summary

Prerequisites

Our industry experts will give candidates the information required to become a Big Data Architect. That said, there are no prerequisites for this program; knowledge of basic programming concepts is beneficial, but certainly not mandatory.

Reach out to us for more information

Interested in this course? Let’s connect!


Course features

Course Duration

126 hours of live, online, instructor-led training

24x7 Support

Technical & query support round the clock

Lifetime LMS Access

Access all the materials on the LMS anytime, anywhere

Price Match Guarantee

Guaranteed best price aligning with the quality of deliverables

FAQs

Who are the trainers?

Certified Industry Experts/Subject Matter Experts with immense experience under their belt.

What internet speed is required?

An internet speed of at least 2 Mbps is essential.

What if I miss a training session?

Candidates need not worry about missing a training session. They will be able to view the recorded sessions on the LMS. We also have a technical support team to assist candidates with any queries.

How long will I have access to the course material?

Access to the Learning Management System (LMS) is for a lifetime and includes class recordings, presentations, sample code, and projects.