Italian Data Analytics Framework (DAF)

The Data & Analytics Framework (DAF) is an open source project developed in the context of the activities planned by the Italian Three-Year Plan for ICT in Public Administration 2017 - 2019, approved by the Italian Government in 2017.

The DAF project attempts to establish a central Chief Data Officer (CDO) function for the Italian Government and Public Administration. Its main goals are to promote data exchange among Italian Public Administrations (PAs), to support the diffusion of open data, and to enable data-driven policies. The framework is composed of three building blocks:

  • A Big Data Platform, which stores the data of the PAs in a single repository and implements ingestion procedures that promote standardisation, and therefore interoperability, among them. It exposes functionalities common to the Hadoop ecosystem, a set of (micro)services designed to improve data governance, and a number of end-user tools integrated with them.
  • A Team of Data Experts (data scientists and data engineers), able to manage and evolve the platform and to support PAs in their analytics and data management activities in a consulting capacity.
  • A Regulatory Framework that institutionalises this activity at the government level and gives the proper mandate to the PA that will manage the DAF, in compliance with privacy regulations.

The DAF is designed to be easily reusable in other countries and application domains. It exposes the following data management and analytics functionalities:

  • Public Dataportal, a Web user interface providing:

    • a catalog of open-data datasets based on CKAN;
    • a content management system for data stories: blog-style posts that integrate interactive charts (made using the DAF) with a narrative description of the analysis performed;
    • community tools to collaborate and learn how to use the platform;
  • Private Dataportal, a web application with the following features:

    • a catalog of all datasets the user can access;
    • an ingestion form to manage (insert, edit, delete) dataset metadata and to set up ingestion procedures;
    • data visualization and dashboard tools;
    • a data science notebook;
  • Hadoop Cluster with typical applications to centralise, store, manipulate, standardise, and redistribute data and insights;

  • Multi-tenant architecture, based on Kerberos and LDAP.
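Because the Public Dataportal's catalog is based on CKAN, its datasets can be queried programmatically through the standard CKAN Action API (the `package_search` action). The sketch below shows this interaction in Python using only the standard library; the base URL is a placeholder, not the actual DAF endpoint, and the sample response is illustrative only.

```python
import json
from urllib.parse import urlencode

# Placeholder base URL -- substitute the actual CKAN-backed dataportal endpoint.
CKAN_BASE = "https://example-dataportal.gov.it"

def package_search_url(query: str, rows: int = 10) -> str:
    """Build a CKAN Action API v3 `package_search` URL for a free-text query."""
    params = urlencode({"q": query, "rows": rows})
    return f"{CKAN_BASE}/api/3/action/package_search?{params}"

def dataset_titles(response_text: str) -> list[str]:
    """Extract dataset titles from a CKAN `package_search` JSON response."""
    payload = json.loads(response_text)
    return [pkg["title"] for pkg in payload["result"]["results"]]

# Illustrative response in the shape CKAN returns for package_search.
sample_response = json.dumps({
    "success": True,
    "result": {"count": 1, "results": [{"title": "Orari autobus"}]},
})
```

In practice the URL returned by `package_search_url` would be fetched with any HTTP client (e.g. `urllib.request.urlopen`) and the body passed to `dataset_titles`.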

The DAF was presented at the Big Policy Canvas Workshop at the BDVA Forum 2018 in Vienna by Raffaele Lillo, former Italian Chief Data Officer.

Type of content: Assets
Type of asset: Platform / Portal
Open license availability: Yes