Excavate Data for Decisions, Hortonworks Sandbox: HDP & HDF
The Hortonworks Sandbox for the Hortonworks Data Platform (HDP) and Hortonworks DataFlow (HDF) is a quick and easy personal desktop environment for learning, exploring, developing, testing, and trying new features. This three-hour workshop provides an overview of the Apache Hadoop ecosystem. The HDF Sandbox is likewise a single-node cluster for data flow and stream processing, built on Apache NiFi, Apache Kafka, and Apache Storm. Participants gain hands-on experience with HDP sandbox basics and real-world examples. The workshop also covers a visualization framework through which end users can display analytics data at different geographic levels. Data sources will be CSV files or Spark.
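As a small taste of the kind of exercise involved, the sketch below (plain Python with the standard library only; the file contents and column names are hypothetical, and the workshop itself works inside the HDP sandbox tooling rather than this script) aggregates CSV records by geographic level — the shape a region-level analytics chart typically needs:

```python
import csv
import io
from collections import defaultdict

# Hypothetical CSV export: one row per measurement, tagged with a region.
SAMPLE = """region,city,sales
Penang,George Town,120
Penang,Butterworth,80
Selangor,Shah Alam,200
"""

def totals_by_region(csv_text):
    """Sum the 'sales' column per region, ready for a map-level visualization."""
    totals = defaultdict(int)
    for row in csv.DictReader(io.StringIO(csv_text)):
        totals[row["region"]] += int(row["sales"])
    return dict(totals)

print(totals_by_region(SAMPLE))
# {'Penang': 200, 'Selangor': 200}
```

The same aggregation could equally be expressed as a Spark DataFrame `groupBy` inside the sandbox; the stdlib version is shown only to keep the illustration self-contained.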
Speakers & Instructors:
– J. Joshua Thomas, Ph.D. (Senior Lecturer, Computing), Department of Computing, KDU Penang University College.
– Bahari Belaton, Ph.D. (Assoc. Prof.), School of Computer Sciences, Universiti Sains Malaysia.
Workshop date: 30th November 2017; time: 9 am – 12 pm; venue: Universiti Kebangsaan Malaysia.
This workshop is organised as part of the activities of The 5th International Visual Informatics Conference 2017 (IVIC’17).