- Videos: 844
- Views: 7,174,097
Learning Journal
India
Joined 20 Aug 2016
Learn more at www.scholarnest.com/
The best place to learn Data Engineering, Big Data, Apache Spark, Databricks, Apache Kafka, Confluent Cloud, AWS Cloud Computing, Azure Cloud, and Google Cloud: self-paced, instructor-led, and certification courses, plus practice tests.
SPARK
www.scholarnest.com/courses/spark-programming-in-python-for-beginners/
www.scholarnest.com/courses/spark-streaming-for-python-programmers/
www.scholarnest.com/courses/spark-programming-in-scala-for-beginners/
KAFKA
www.scholarnest.com/courses/apache-kafka-for-beginners/
www.scholarnest.com/courses/kafka-streams-master-class/
Find us on Udemy
Visit the link below for our Udemy courses
www.learningjournal.guru/courses/
Find us on O'Reilly
www.oreilly.com/library/view/apache-kafka-for/9781800202054/
www.oreilly.com/videos/apache-kafka/9781800209343/
www.oreilly.com/videos/kafka-streams-with/9781801811422/
Apache Spark Performance Tuning | Scenario based interview question | Cluster Autoscaling
Get this course: www.scholarnest.in/courses/apache-spark-performance-tuning
Please visit our website below for more courses and live instructor-led training.
Learn more at www.scholarnest.in/
Keep Learning and Keep Growing.
Views: 819
Videos
Apache Spark Performance Tuning Course | Tuning Terabyte Join | Tuning large table joins
739 views · 2 months ago
This course is available exclusively on ScholarNest LMS. Check the link below if you wish to get it, or WhatsApp 91-93534 65988 for discount offers. www.scholarnest.in/courses/apache-spark-performance-tuning Please visit our website below for more courses and live instructor-led training. Learn more at www.scholarnest.in/ Keep Learning and Keep Growing.
11 - Azure Databricks Platform Architecture
919 views · 2 months ago
This course is available on Udemy and with a Udemy for Business subscription. Check the link below if you wish to get it on Udemy. www.udemy.com/course/master-azure-databricks-for-data-engineers/?referralCode=6C313138590E175DAA6F Please visit our website below for more courses and live instructor-led training. Learn more at www.scholarnest.com/ Keep Learning and Keep Growing.
10 - Introduction to Databricks Workspace
663 views · 2 months ago
9 - Creating Databricks Workspace Service
556 views · 3 months ago
Apache Spark Performance Tuning on Databricks | Scenario based Spark performance tuning course
1.6K views · 3 months ago
Get this course: www.scholarnest.in/courses/apache-spark-performance-tuning Reach out to our WhatsApp number 91-93534 65988 for a discount coupon. SPARK COURSES www.scholarnest.com/courses/spark-programming-in-scala-for-beginners/ www.scholarnest.com/courses/spark-programming-in-python-for-beginners/ www.scholarnest.com/courses/spark-streaming-for-python-programmers/ www.scholarnest.com/courses/spar...
08 - Azure Portal Overview
384 views · 3 months ago
07 - Creating Azure Account
362 views · 3 months ago
06 - What will you learn in this section
435 views · 3 months ago
05 - Introduction to Databricks Platform
824 views · 4 months ago
04 - Why do we need Databricks for Apache Spark projects
1.1K views · 4 months ago
03 - Introduction to Data Engineering
917 views · 4 months ago
02 - Course Prerequisites | Master Azure Databricks for Data Engineers
856 views · 4 months ago
01 - Master Azure Databricks for Data Engineers | About the Course
2.5K views · 4 months ago
02 - Course Prerequisites
609 views · 4 months ago
How to get the source code and examples for this course? All the source code, examples, data, and other resources are available in a public GitHub repository; you can refer to it anytime. Course GitHub Repository: github.com/LearningJournal/Apache-Spark-and-Databricks-Stream-Processing-in-Lakehouse You can also download it using the link below. Compressed Source Code Download Link: github.c...
06 - Batch processing to stream processing
1.1K views · 8 months ago
07 - Your first application - Applying Best Practice
876 views · 8 months ago
08 - Your first streaming application Implementing Stream
645 views · 8 months ago
05 - Working in Databricks Workspace
948 views · 8 months ago
04 - Setup your Databricks Community Cloud Environment
460 views · 8 months ago
SS10 Creating your first stream processing application Python
72 views · 8 months ago
Creating your first stream processing application
506 views · 8 months ago
Streaming Sources Sinks and Output Mode
728 views · 8 months ago
Wonderful explanation. I was studying Data Cloud in Salesforce, and they mentioned this data format multiple times. I was clueless, but I got clarity from your video. Thank you, sir.
Excellent video, thank you, Master.
I have completed this course on Udemy and highly recommend it. It's very well explained and easy to understand.
Hello, is this page updated? Can we rely on it by becoming a member and stay updated? If not, where are all your courses updated? I took your PySpark course on Udemy. Though the beginning was really good, the later part of the course did not have a continuous flow. How do I enroll in your batch course?
To watch all the videos of the Databricks course playlist, should we subscribe at 199/- or 399/-?
Such engaging content, you don't lose me for a second. Amazing explanation, bless you brother. In my language, "SAADAA KHUSHBU".
Great, but the follow-up question from the interviewer is: how do we arrive at 4X memory per executor?
Spark reserved memory is 300 MB in size, and executor memory should be at least 1.5X the Spark reserved memory, i.e. 450 MB, which is why we take executor memory per core as 4X; that works out to 512 MB per core per executor.
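A quick sanity check of the arithmetic in the reply above (a sketch only: the 300 MB reserved memory and the 128 MB default partition size are standard Spark figures, and the 4X multiplier is the rule of thumb being discussed, not a Spark setting):

```python
# Rule-of-thumb executor memory sizing, as discussed in the comments above.
RESERVED_MB = 300    # Spark's fixed reserved memory
PARTITION_MB = 128   # default max partition size (spark.sql.files.maxPartitionBytes)
MULTIPLIER = 4       # the 4X rule of thumb per core

min_safe_mb = RESERVED_MB * 1.5          # floor: at least 1.5x reserved -> 450 MB
per_core_mb = PARTITION_MB * MULTIPLIER  # 128 * 4 = 512 MB per core

# 512 MB clears the 450 MB floor, which is why 4X is a comfortable multiplier.
assert per_core_mb >= min_safe_mb

cores_per_executor = 5
executor_mb = per_core_mb * cores_per_executor  # 512 * 5 = 2560 MB of heap
print(per_core_mb, executor_mb)
```

Note the 2560 MB figure is heap only; a real `spark.executor.memory` request would typically also budget for off-heap/overhead memory on top of it.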
Is there still a coupon to get this course for free?
Please provide prerequisites
How to certify
Awesome sir
Thanks man it worked.
Very simple and precise. Thank you
THANKS
Is this course suitable for Scala users, or do we need Python knowledge?
What if the cluster size is fixed? Also, shouldn't we take the per-node constraint into account? For example, what if the number of cores in a node is 4?
very very good and valuable course.
In the last step, you did kinit, which pulled the TGT, and then the dev user could list the files. At what point did the client interact with the TGS using this TGT?
The course is very well organized
Thank you so much. Well explained about Root user.
incredible, thanks
In the last question, every value you took was the default (128 MB, 4, 512 MB, 5 cores), so let's say the question is for 50 GB of data; would 3 GB still be the answer?
ModuleNotFoundError: No module named 'pyspark.streaming.kafka' — I get this error using the command spark-submit --packages org.apache.spark:spark-streaming-kafka-0-10_2.13:3.5.1 live_processing.py. Can you help, please?
If the number of cores is 5 per executor, then at shuffle time Spark creates 200 partitions by default. How will those 200 partitions be created if the number of cores is less, given that 1 partition is processed on 1 core? Suppose my config is 2 executors, each with 5 cores. Now, how will it create 200 partitions if I do a group-by operation? There are 10 cores, and 200 partitions need to be placed on them, right? How is that possible?
You can set the number of partitions equal to the number of cores for maximum parallelism; of course, you cannot run 200 partitions in parallel in this case.
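One clarifying note on the exchange above (a sketch under default settings, not tied to any particular cluster): Spark does not need one core per shuffle partition. Each partition becomes a task, tasks queue up, and the stage runs in "waves" of at most (executors × cores) tasks at a time, so 200 partitions on 10 cores simply means about 20 waves:

```python
import math

# Illustrative only: how many scheduling "waves" a stage needs when there
# are more shuffle partitions (tasks) than total cores in the cluster.
def stage_waves(num_partitions: int, executors: int, cores_per_executor: int) -> int:
    total_cores = executors * cores_per_executor
    # Each wave runs at most `total_cores` tasks in parallel;
    # remaining tasks wait in the scheduler queue for a free core.
    return math.ceil(num_partitions / total_cores)

# Default spark.sql.shuffle.partitions = 200; 2 executors x 5 cores = 10 slots.
print(stage_waves(200, 2, 5))  # → 20 waves of 10 tasks each
```

So 200 partitions with 10 cores is possible, just slower; setting partitions equal to (or a small multiple of) the core count reduces the number of waves.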
Hi, thanks for the explanation; it really helps. In the above example, let's say on the right stream we get impressionId=4, and we don't get a matching event for id=4 on the left stream for a long time. Is it possible to get this record inside the foreachBatch() function before it gets dropped by Spark?
Very well explained
That is an extraordinary explanation, thank you.
Best video about these three abstractions.
Thank you for explaining. I was looking for a starter example to understand what this is, but other videos explained it as if to experts. I figured out how to follow your steps, but after running the code and the ncat command I am getting errors; the first one mentions "chk-point-dir". Any help?
C:\kafka\bin\windows>kafka-console-producer.bat --topic test2 --broker-list localhost:9092 < ..\data\sample1.csv fails with "The system cannot find the path specified." How do I fix this error?
Insightful explanation. Thanks for the video.
Great job, Sir
beautifully explained...
Well spoken, nicely explained.
Sir you are best❤
I have something clear in my head now. Thank you very much.
I'm using macOS; how do I set all this up?
Thank you for your lecture. I am following all of your course related to big data.
Any chance this course will come to Udemy?
No. This course is exclusive to the ScholarNest platform.
Where can I enroll for or buy this Snowflake course? Thank you.
Wonderful. Cleared a lot of doubts.
Kafka doesn't allow more than 2 consumers to read from the same partition, to avoid the same message being read multiple times. Isn't this the case when 2 consumers listen to the same partition?
Thank you
Thanks, it is a great guide and simple at the same time.
When will more videos come? Can you create a playlist as well?
What are the alternatives if we don't want to use a while loop? I'm trying to use a Spring XML-based project, not Spring Boot. Any idea?
great job, nice explanation