Data Science - Hadoop Ecosystem & Security
what is Big Data/ 4 V's of Big Data - answer-a collection of data sets so large and
complex that your legacy IT systems cannot handle them. (Terabytes, petabytes,
exabytes of data). Data is considered 'Big Data' if it satisfies the four V's:

Volume - size/scale of data

Variety - the different forms of data; data is often unstructured or semi-structured

Velocity - speed of processing data

Veracity - (an extra V added by IBM) uncertainty about the quality of the data; analysis of
streaming data

what is velocity? - answer-speed of data processing

what is geodata? - answer-geographic data, like GPS coordinates

what is graph data structure? - answer-connected nodes with relations between them,
like friends on Facebook

what model/algorithm is not used in data science? - answer-graph detection

What license type is the Hadoop core using? - answer-Apache license

Projects under the Apache umbrella are released under the Apache license

where was Hadoop invented? - answer-Yahoo!

Hadoop was invented at Yahoo! and inspired by Google's GFS (Google File System) and
Google's MapReduce papers

select the correct answer - answer-Hadoop scales horizontally and will have a lower cost
per GB when storing a lot of data

which Hadoop distribution is fully open source? - answer-Apache Hadoop and
Hortonworks

what is the management frontend in Hortonworks called? - answer-Ambari

What application / service stores the data on the hadoop cluster? - answer-HDFS

Is the Namenode running on a master node or on worker node? - answer-Master Node

Is the NodeManager running on a master node or on worker node? - answer-worker node
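
A quick way to check which daemons a given node is actually running (a minimal sketch; it assumes you can SSH into the node and that the JDK's jps tool is installed there):

$ vagrant ssh node1
$ sudo jps    # a master node typically shows NameNode / ResourceManager; a worker shows DataNode / NodeManager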

Vagrant - answer-can create and configure virtual development environments
- lightweight
- reproducible
- portable
- wrapper around VirtualBox, KVM, AWS, Docker, VMWare, Hyper-V

create identical dev environments for operations and developers

disposable environments

to launch a 3-node cluster - answer-open a command prompt, go to the folder where Vagrant is installed, and type
$ vagrant up node1 node2 node3

in my case: c:\HashiCorp\Vagrant>

* in my experience, though, go to the project folder you created, type $ vagrant init,
and then run the command above
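
To verify that the machines actually came up, a minimal sketch using standard Vagrant commands (the node names assume the course's three-node Vagrantfile):

$ vagrant status      # shows node1, node2, node3 and their state
$ vagrant ssh node1   # log into the first node
$ exit                # leave the VM again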

suspend machines - answer-(when you don't need them for now; resume them when you
want to use them again)

$ vagrant suspend
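
The counterpart of suspend is resume; a minimal sketch, run from the same project folder:

$ vagrant suspend    # save the VM state to disk and stop the machines
$ vagrant resume     # bring the suspended machines back up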

remove machines and all data - answer-$ vagrant destroy

Vagrantfile - answer-- a project has a Vagrantfile with the configuration
- it marks the root directory of your project
- it describes the kind of machine and resources needed for the project to run, and what
software to install

to change network settings, memory, or CPU, open the Vagrantfile in Notepad (or an
equivalent editor) and make changes; see the sketch below for applying them
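
Running machines do not pick up edits to the Vagrantfile automatically; a minimal sketch of checking and applying the changes with standard Vagrant commands (node name assumed from the course setup):

$ vagrant validate        # check the Vagrantfile for syntax errors
$ vagrant reload node1    # restart node1 so it picks up the new memory/CPU/network settings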

Ambari-server - answer-runs on node 1; from there it installs and configures all the nodes
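
For reference, a minimal sketch of how an Ambari server is typically set up and started on node 1 (assumes the ambari-server package is already installed there, as in a Hortonworks-based setup):

$ vagrant ssh node1
$ sudo ambari-server setup    # one-time interactive setup (JDK, database)
$ sudo ambari-server start
$ sudo ambari-server status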

things to install - answer-VirtualBox

Vagrant

get file from web in cmd - answer-wget http://github.com/wardviaene/hadoop-ops-course/archive/master.zip

unzip file in cmd - answer-sudo apt install unzip
unzip master.zip

HDFS (Hadoop Distributed File System) - answer-when data is uploaded here, it is
divided into blocks and distributed over the cluster; it is the data (storage) part of Hadoop;
the service running on a typical worker machine is called a DataNode

* to upload a file to HDFS:
hadoop fs -put
- puts metadata in the NameNode, which sends back lists of IP addresses of DataNodes
- each DataNode holds copies of the data (3x by default)

- Hadoop interacts with HDFS through shell commands
- data transfer rate is very high
- infinitely scalable
- fits well with replication to provide fault tolerance and availability
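
A minimal sketch of the basic HDFS shell interaction described above (the file names are just examples from this course):

$ hadoop fs -put /vagrant/data/salaries.csv salaries.csv   # upload into the user's HDFS home directory
$ hadoop fs -ls                                            # list files; the second column shows the replication factor
$ hadoop fs -cat salaries.csv | head                       # read the file back out of HDFS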

DataNode - answer-sends block reports & heartbeats to the NameNode. DataNodes hold the
copies/backups of the information being stored
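
A quick way to see the DataNodes the NameNode currently knows about, together with their capacity and last heartbeat (standard HDFS admin command):

$ hdfs dfsadmin -report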

How to upload a file to HDFS - answer-open a command prompt in the project folder, then:
$ vagrant ssh node1   (logs into node1)
$ hadoop fs -put /vagrant/data/salaries.csv salaries.csv

check the size of a file (and hence how many HDFS blocks it will need) - answer-$ ls -ahl /vagrant/data/salaries.csv

*outputs something like:
1 vagrant vagrant 16M Mar 4 17:09 /vagrant/data/salaries.csv

where 16M is 16 megabytes, too small to be split into multiple blocks
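
Whether a file is split into multiple blocks depends on the configured HDFS block size (128 MB by default in Hadoop 2.x); a minimal sketch of how to check it on the cluster:

$ hdfs getconf -confKey dfs.blocksize    # prints the block size in bytes, e.g. 134217728 for 128 MB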

look at more details of the file in HDFS - answer-$ hdfs fsck /user/vagrant/salaries.csv

this tells you more info, including how many blocks it contains
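
A slightly more detailed variant (standard hdfs fsck options) also lists the individual blocks and the DataNodes they are stored on:

$ hdfs fsck /user/vagrant/salaries.csv -files -blocks -locations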

Hadoop - answer-the framework to process and analyze Big Data; a framework that
allows for the distributed processing of large data sets across clusters of computers
using a simple programming model

- supports huge volumes of data
- stores data efficiently and reliably
- data loss is unavoidable, so the solution should provide good recovery strategies
- the solution should be horizontally scalable as the data grows
- should be cost-effective
- minimize the learning curve; it should be easy for programmers and non-programmers
- scalable (same program runs on 1, 1000, or 4000 machines); scales linearly
- simple APIs
- petabytes of data

- high availability
- scalability
- fault tolerant
- economic

- a set of open source projects
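
As a small illustration of the "simple programming model" and scalability points above, the same example job runs unchanged on one node or a whole cluster; a minimal sketch using the MapReduce examples jar that ships with Hadoop (the exact jar path and version vary per distribution, and "input" is assumed to be a directory of text files already in HDFS):

$ hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-*.jar wordcount input output
$ hadoop fs -cat output/part-r-00000 | head    # word counts produced by the job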
