From Knoesis wiki
Revision as of 15:20, 20 October 2015 by Jibril (Talk | contribs) (Suggested Pre-Requisite Knowledge)


Semantics Approach to Big Data and Event Processing


Variety, Velocity, Volume, and Veracity are the four Vs of Big Data. Most available technologies have shown how to handle Volume. However, with the growing number of streaming data sources, the Velocity problem is more pressing than ever. Moreover, the Veracity and especially the Variety problems further raise the difficulty of the challenge.

This course focuses on two aspects of the Big Data problem, Velocity and Variety, and shows how streaming data and semantic technologies together enable efficient and effective stream processing for advanced application development.

Covered Topics

  • Big Data Introduction with focus on textual and sensor streaming data
  • Deep dive into Velocity:
    • History and principles of stream computing and complex event processing
    • Modelling frameworks for complex event processing and data stream management
    • Processing social data streams
    • Processing physical (sensor/IoT) data streams
  • Deep dive into Variety:
    • Semantic Web standards, languages, models, query processing, and reasoning techniques relevant to various modalities and integrated processing.
  • Variety and Velocity at once: Stream reasoning, Tools and applications


Students will gain enough background to use tools made available by the academic and industrial communities to solve prototypical problems. Example applications include Smart Cities, Social Media Analytics, Sensor Networks, Situational Awareness, and Digital Health.

Suggested Pre-Requisite Knowledge

Intermediate IT level (programming or application development skills, knowledge of data management and Web technologies).

Detailed Program

DAY 1: 6th October 2015

  • Module 1: Emanuele Della Valle: Introduction on Big Data with an emphasis on the Velocity and Variety dimensions. [45 min]

  • Module 2: Amit Sheth: Mastering the variety dimension of Big Data with semantic technologies; high level intro to standards. Focus on variety/interoperability dimension. [45 min]

  • Module 3: Emanuele Della Valle: Walk-through of semantic technologies: specific examples involving RDF, SPARQL, OWL, and R2RML. [30 min]
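To give a flavor of what such a walk-through covers, the sketch below illustrates RDF-style triples and SPARQL-like basic graph pattern matching in pure Python. This is a toy illustration only, not the course material itself; real exercises would use an RDF library and an actual SPARQL engine, and the `ex:`/`ssn:` names are made-up examples.

```python
# Toy illustration of RDF-style triples and SPARQL-like pattern matching.
# (Pure Python; a real course exercise would use an RDF library.)

# A tiny RDF-style graph: (subject, predicate, object) triples.
graph = {
    ("ex:sensor1", "rdf:type", "ssn:Sensor"),
    ("ex:sensor1", "ssn:observes", "ex:Temperature"),
    ("ex:sensor2", "rdf:type", "ssn:Sensor"),
    ("ex:sensor2", "ssn:observes", "ex:Humidity"),
}

def match(graph, pattern):
    """Match one triple pattern; terms starting with '?' are variables.
    Returns a list of variable bindings, like a one-pattern SPARQL query."""
    results = []
    for triple in graph:
        binding = {}
        for p, t in zip(pattern, triple):
            if p.startswith("?"):
                binding[p] = t
            elif p != t:
                break
        else:
            results.append(binding)
    return results

# Analogous to: SELECT ?s WHERE { ?s rdf:type ssn:Sensor }
sensors = match(graph, ("?s", "rdf:type", "ssn:Sensor"))
print(sorted(b["?s"] for b in sensors))  # ['ex:sensor1', 'ex:sensor2']
```

The same pattern-matching idea, extended to joins over several triple patterns, is the core of SPARQL query evaluation.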

  • Module 4: Pramod Anantharam: Examples of applied semantic technologies to solve variety: SSN annotation. [30 min]

  • Module 5: Amit Sheth: Examples of applied semantic technologies to solve variety: Social data annotation. [30 min]

DAY 2: 7th October 2015

  • Module 6: Emanuele Della Valle: Mastering the velocity dimension of Big Data with stream processing technologies. [75 min]

  • Module 7: Riccardo Tommasini: Walk-through on stream processing technologies. [45 min] [1] [2] [3] [4]
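As a hint of what stream processing engines do, the following sketch computes a sliding-window aggregate, the basic operation of data stream management and CEP systems. It is a minimal pure-Python illustration under assumed inputs, not any specific engine's API; the event keys are invented examples.

```python
from collections import deque

# Toy sliding-window aggregate in the style of a DSMS / CEP engine:
# count events per key over the last `window` time units.
def windowed_counts(events, window):
    """events: iterable of (timestamp, key), in timestamp order.
    Yields (timestamp, {key: count}) over a window ending at each event."""
    buf = deque()
    for ts, key in events:
        buf.append((ts, key))
        # Evict events that fell out of the window.
        while buf and buf[0][0] <= ts - window:
            buf.popleft()
        counts = {}
        for _, k in buf:
            counts[k] = counts.get(k, 0) + 1
        yield ts, counts

stream = [(1, "flood"), (2, "flood"), (4, "storm"), (12, "flood")]
for ts, counts in windowed_counts(stream, window=10):
    print(ts, counts)  # at ts=12, the events at ts=1 and ts=2 have expired
```

Production engines add declarative query languages, out-of-order handling, and distribution, but the window-and-evict loop above is the conceptual core.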

  • Module 8: Amit Sheth: Examples of applied stream processing technologies to solve velocity: Twitris. Specific examples of the velocity challenge and how it is addressed in a disaster coordination scenario (e.g., the Jammu & Kashmir floods). [60 min]

DAY 3: 8th October 2015

  • Module 9: Emanuele Della Valle: Stream Reasoning: mastering the velocity and variety dimensions of Big Data at the same time. [60 min]

  • Module 10: Riccardo Tommasini: Hands on stream reasoning technologies. [30 min]
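The flavor of such hands-on work can be sketched by combining the two previous ideas: a continuous query over a window of RDF-style triples, with one RDFS-like inference rule applied inside the window. This is a toy illustration in the spirit of RSP engines such as C-SPARQL, not their actual API, and the class names are made-up examples.

```python
# Toy stream-reasoning sketch: a windowed query over RDF-style triples
# with one RDFS-like subclass entailment materialized before matching.

# Assumed ontology fact: ex:TempSensor rdfs:subClassOf ssn:Sensor
SUBCLASS = {"ex:TempSensor": "ssn:Sensor"}

def sensors_in_window(timed_triples, now, window):
    """Return subjects typed (directly or via subclass) as ssn:Sensor
    among triples with timestamp in (now - window, now]."""
    result = set()
    for ts, (s, p, o) in timed_triples:
        if now - window < ts <= now and p == "rdf:type":
            # Apply the subclass rule, then match the query's type.
            if o == "ssn:Sensor" or SUBCLASS.get(o) == "ssn:Sensor":
                result.add(s)
    return result

stream = [
    (1, ("ex:a", "rdf:type", "ex:TempSensor")),
    (5, ("ex:b", "rdf:type", "ssn:Sensor")),
    (20, ("ex:c", "rdf:type", "ex:TempSensor")),
]
print(sensors_in_window(stream, now=20, window=10))  # {'ex:c'} (a and b expired)
```

Real stream reasoners face exactly this tension at scale: reasoning must be fast enough to keep up with the window sliding.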

  • Module 11: Pavan Kapanipathi: Semantic Filtering as an example of semantic technologies for real-time analysis. [30 min]

  • Module 12: Pramod Anantharam: Event correlation across Social and IoT streams as an example of data integration. [30 min]

  • Module 13: Emanuele Della Valle: City Sensing: listening to the pulse of our cities by fusing Social Media Streams and Call Data Records. [30 min]