Data Engineering

107 bookmarks
How to install Apache Spark on Ubuntu using Apache Bigtop
Want to install Apache Spark using Apache Bigtop? This step-by-step tutorial shows how. Bigtop is a package manager for Spark, HBase, Hadoop, and other Apache big data projects. The tutorial is aimed at machine learning engineers and data scientists looking for a convenient way to manage the big data components of their ecosystem.
·blog.miz.space·

Hadoop ecosystem with docker-compose
Construct a Hadoop-ecosystem cluster composed of 1 master, 1 DB, and n slaves using docker-compose, and get hands-on experience with the Hadoop MapReduce routine and with Hive, Sqoop, and HBase from the Hadoop ecosystem.
·hjben.github.io·

Hadoop Yarn Configuration on Cluster
This post explains how to set up the YARN master on a Hadoop 3.1 cluster and run a MapReduce program. Before you proceed, make sure you have a Hadoop 3.1 cluster up and running; if you do not, follow the linked setup guide first and then come back to this page.
·sparkbyexamples.com·

Spark Step-by-Step Setup on Hadoop Yarn Cluster
This post explains how to set up Apache Spark and run Spark applications on Hadoop with the YARN cluster manager, using yarn as the master and client as the deploy mode for the Spark examples. You can also try running the Spark application in cluster mode. Prerequisites: if you don't have Hadoop and YARN installed, install and set up a Hadoop cluster with YARN before proceeding with this article. To install and set up Apache Spark on the Hadoop cluster, go to the Apache Spark download site and open the Download Apache Spark section.
·sparkbyexamples.com·

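For orientation, here is a minimal PySpark sketch of submitting work to YARN in client deploy mode. It assumes HADOOP_CONF_DIR (or YARN_CONF_DIR) points at the cluster configuration and that Spark is already installed per the post above; the app name is a placeholder, not something from the linked article.

```python
# Minimal PySpark smoke test against a YARN cluster manager (client deploy mode).
# Assumes HADOOP_CONF_DIR / YARN_CONF_DIR are exported so Spark can locate the cluster.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("yarn-client-smoke-test")            # hypothetical app name
    .master("yarn")                               # hand resource management to YARN
    .config("spark.submit.deployMode", "client")  # driver runs locally, executors on the cluster
    .getOrCreate()
)

# Distribute a small range across the executors and count it back on the driver.
print(spark.range(1000).count())
spark.stop()
```
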
Apache Hadoop Installation on Ubuntu (multi-node cluster)
Below are the steps for installing Apache Hadoop on a Linux Ubuntu server. If you have a Windows laptop with enough memory, you can create 4 virtual machines with Oracle VirtualBox and install Ubuntu on them; this article assumes Ubuntu is already running and doesn't explain how to create the VMs or install the OS. Apache Hadoop is an open-source distributed storage and processing framework used to run large data sets on commodity hardware. Hadoop natively runs on Linux, and this article walks through the installation of Hadoop 3.1.1 step by step.
·sparkbyexamples.com·

Multinode Hadoop installation steps - DBACLASS
Multi-node cluster in Hadoop 2.x: here we take two machines, master and slave, and a DataNode runs on both. Let us start with the setup of a multi-node cluster in Hadoop. Prerequisites: CentOS 6.5, Hadoop 2.7.3, Java 8, SSH. We have two machines (master and slave) with IP: Master […]
·dbaclass.com·

Hive - Installation
All Hadoop sub-projects such as Hive, Pig, and HBase support the Linux operating system, so you need a Linux-flavored OS installed; the tutorial then walks through the simple installation steps.
·tutorialspoint.com·

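Once HiveServer2 from that installation is up, a quick client query is an easy way to check it. A minimal sketch, assuming HiveServer2 listens on localhost:10000 and PyHive (with its Hive dependencies) is installed; neither of these is covered by the linked tutorial, and the username is hypothetical.

```python
# Sanity-check a fresh Hive install by querying HiveServer2 over its Thrift interface.
# Assumes HiveServer2 is running on localhost:10000 and PyHive is installed.
from pyhive import hive

conn = hive.Connection(host="localhost", port=10000, username="hive")  # hypothetical user
cursor = conn.cursor()
cursor.execute("SHOW DATABASES")
print(cursor.fetchall())  # a new install should at least list the `default` database
conn.close()
```
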
Onehouse
·onehouse.ai·

How to Put a Database in Kubernetes - DZone Cloud
Learn the key steps for deploying databases and stateful workloads in Kubernetes, and meet cloud-native technologies that can streamline running Apache Cassandra on K8s.
·dzone.com·

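As a small companion to the deployment steps described there, here is a hedged client-side sketch for verifying a Cassandra cluster exposed inside Kubernetes; the Service name "cassandra" and the keyspace are hypothetical, and the DataStax cassandra-driver package is assumed to be installed.

```python
# Verify a Cassandra cluster running in Kubernetes from a pod in the same cluster.
# Assumes a Service named "cassandra" (hypothetical) fronts the pods on port 9042
# and the DataStax driver is installed: pip install cassandra-driver
from cassandra.cluster import Cluster

cluster = Cluster(["cassandra"], port=9042)  # the Service DNS name resolves to the pods
session = cluster.connect()

session.execute(
    "CREATE KEYSPACE IF NOT EXISTS demo "
    "WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}"
)
session.set_keyspace("demo")
session.execute("CREATE TABLE IF NOT EXISTS kv (k text PRIMARY KEY, v text)")
session.execute("INSERT INTO kv (k, v) VALUES (%s, %s)", ("hello", "world"))
print(session.execute("SELECT v FROM kv WHERE k = %s", ("hello",)).one().v)
cluster.shutdown()
```
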
10 Skills to Ace Your Data Engineering Interviews
Preparing for a data engineering interview and overwhelmed by all the tools and concepts? Then this post is for you: it goes over the most common tools and concepts you need to know to ace your data engineering interviews.
·startdataengineering.com·

What's the difference between ETL & ELT?
This post goes over what the ETL and ELT data pipeline paradigms are, addresses the inconsistency in naming conventions and how to understand what each really means, and ends with a comparison of the two paradigms and how to use these concepts to build efficient and scalable data pipelines.
·startdataengineering.com·

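To make the distinction concrete, here is a toy Python sketch contrasting the two paradigms, with sqlite3 standing in for a data warehouse; the orders.csv file and its column names are hypothetical and not taken from the linked post.

```python
# Toy ETL vs ELT contrast; sqlite3 stands in for a data warehouse.
# The orders.csv file and its user_id / amount_usd columns are hypothetical.
import csv
import sqlite3

conn = sqlite3.connect("warehouse.db")

def etl(path: str) -> None:
    """ETL: transform rows in application code *before* loading into the warehouse."""
    with open(path, newline="") as f:
        rows = [
            (r["user_id"], float(r["amount_usd"]) * 100)  # transform: dollars -> cents
            for r in csv.DictReader(f)
        ]
    conn.execute("CREATE TABLE IF NOT EXISTS orders_etl (user_id TEXT, amount_cents REAL)")
    conn.executemany("INSERT INTO orders_etl VALUES (?, ?)", rows)

def elt(path: str) -> None:
    """ELT: load the raw data first, then transform inside the warehouse with SQL."""
    with open(path, newline="") as f:
        raw = [(r["user_id"], r["amount_usd"]) for r in csv.DictReader(f)]
    conn.execute("CREATE TABLE IF NOT EXISTS orders_raw (user_id TEXT, amount_usd TEXT)")
    conn.executemany("INSERT INTO orders_raw VALUES (?, ?)", raw)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders_elt AS "
        "SELECT user_id, CAST(amount_usd AS REAL) * 100 AS amount_cents FROM orders_raw"
    )
```

In the ETL function the cleaning logic lives in the pipeline code and only clean rows reach the warehouse; in the ELT function the raw table is preserved and the transformation is pushed down to the warehouse engine as SQL.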