Blockchain technology, first known as the technology behind the cryptocurrency Bitcoin, is now perceived as a trend in its own right, with several fields of application including the public sector. According to Gartner, the technology has already passed its peak of inflated expectations and will reach its plateau of productivity in 5 to 10 years. This shows that there is considerable hype around the technology, but also that it is not yet mature.
There are various definitions of the Internet of Things (IoT). According to the Internet Engineering Task Force, the basic idea of the Internet of Things is to connect electronic and non-electronic objects, e.g. via RFID tags, sensors, actuators or mobile phones, in order to provide seamless communication and contextual services through them. These objects correspond to the term “things”, while the term “Internet” covers both the TCP/IP suite and non-TCP/IP protocols.
A new computing model – edge computing – is currently evolving, which involves extending data processing to the edge of a network in addition to computing in a cloud or a central data centre. 
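The division of labour between edge and cloud can be illustrated with a minimal sketch. All function names and data below are invented for illustration: raw sensor readings are aggregated at the network edge, and only a compact summary travels to the central data centre.

```python
def edge_aggregate(readings, threshold=30.0):
    """Pre-process raw readings at the edge: keep an aggregate plus alerts."""
    alerts = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": sum(readings) / len(readings),
        "alerts": alerts,
    }  # only this small payload is forwarded to the cloud

def cloud_store(payload, store):
    """The central data centre receives summaries, not raw streams."""
    store.append(payload)

store = []
raw = [21.5, 22.0, 35.2, 21.8]  # e.g. temperature samples at a sensor
cloud_store(edge_aggregate(raw), store)
print(store[0]["count"], store[0]["alerts"])  # 4 [35.2]
```

The point of the sketch is the reduced payload: four raw samples shrink to one summary record before any network transfer takes place.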
The term “smart” in this trend indicates that it is not only about collecting and storing data, but also about automating data analysis. Kim et al. describe this as follows: “Smart surveillance system is mainly composed of automatic video/audio analysis. Therefore, an emerging surveillance system must consider multimedia information for monitoring activities and extracting meaningful information from the environment.”
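A toy sketch can make the idea of automatic analysis concrete. The frames, threshold and function name below are invented for illustration: frames are tiny grayscale images represented as lists of pixel values, and an event is extracted automatically when consecutive frames differ by more than a threshold.

```python
def motion_detected(prev, curr, threshold=10):
    """Flag a frame as an event if it differs enough from its predecessor."""
    diff = sum(abs(a - b) for a, b in zip(prev, curr))
    return diff > threshold

frames = [
    [0, 0, 0, 0],
    [0, 0, 0, 0],    # static scene
    [0, 90, 80, 0],  # something moves into view
]
events = [i for i in range(1, len(frames))
          if motion_detected(frames[i - 1], frames[i])]
print(events)  # [2]
```

Instead of storing every frame for later human inspection, the system itself extracts the meaningful information (here, the index of the frame in which motion occurs).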
Cloud Computing is a model that enables ubiquitous access to a shared pool of configurable technological assets available on-demand in a virtualised environment. Cloud services are remotely managed by cloud service providers and can be rapidly provisioned and released with minimal effort or service provider interaction. It can potentially achieve coherence and economies of scale.
The cloud model encompasses the four deployment models Public, Private, Hybrid and Community, and the following three delivery models:
Software as a Service
Platform as a Service
Infrastructure as a Service
Predictive analytics brings together advanced analytics capabilities. It extracts information from existing data sets in order to determine patterns and predict future impacts and trends. It forecasts what might happen in the future with an acceptable level of reliability, and includes what-if scenarios and risk assessments.
Data analytics encompasses techniques such as regression analysis, pattern matching, multivariate statistics, predictive modelling and forecasting.
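Regression-based forecasting, one of the techniques listed above, can be sketched in a few lines. The data and names are invented for illustration: a least-squares trend line is fitted to historical observations and extrapolated one step ahead, which corresponds to the "what might happen in the future" aspect of predictive analytics.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of a straight line y = slope*x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

months = [1, 2, 3, 4, 5]
demand = [100, 110, 120, 130, 140]  # historical observations
slope, intercept = fit_line(months, demand)
forecast = slope * 6 + intercept     # extrapolate to month 6
print(forecast)  # 150.0
```

Real analytics platforms add confidence intervals and what-if variations around such a point forecast; the sketch only shows the core extrapolation step.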
Gartner assumes that by 2020 modern BI and analytics platform components will deliver smart, governed, search- and visual-based data discovery capabilities. Natural-language generation and artificial intelligence will be a standard feature of 90% of modern BI platforms, and organisations that offer users access to a curated catalogue of internal and external data will realise twice the business value from analytics investments compared with those that do not. Gartner outlined fifteen critical capabilities of a BI and analytics platform:
At present, the most promising application of artificial intelligence is machine learning, a subfield of AI. The Encyclopaedia Britannica states that machine learning is concerned with the implementation of computer software that can learn autonomously.
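The notion of software that learns autonomously can be illustrated with one of the oldest machine learning algorithms, the perceptron. All data below is synthetic: instead of being programmed with a decision rule, the program derives the rule (here, an AND-like logic function) from labelled examples.

```python
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights and a bias from labelled (x1, x2) examples."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in zip(samples, labels):
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = y - pred            # learning signal: label minus prediction
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# the rule is never written down explicitly; it is learned from examples
samples = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 0, 1]
w, b = train_perceptron(samples, labels)

def predict(x1, x2):
    return 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0

print([predict(*s) for s in samples])  # [0, 0, 0, 1]
```

Modern machine learning replaces this single learned unit with deep networks of millions of such units, but the principle of adjusting parameters from examples is the same.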
Artificial Intelligence (AI) offers a way to improve policy and decision making and can be understood as the automation of intelligent, human-like behaviour. The most important techniques to support specific cases of highly complex policy-making processes are decision support and optimisation techniques, game theory, data and opinion mining, agent-based simulation and visual scenario-based evaluations.
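One of the decision support techniques mentioned above, scenario-based evaluation, can be sketched as an expected-utility calculation. The policy options, scenarios, probabilities and utility scores below are entirely invented for illustration: each option is scored against weighted what-if scenarios and the option with the highest expected utility is selected.

```python
scenarios = {"growth": 0.5, "stagnation": 0.3, "recession": 0.2}  # probabilities
utilities = {  # utility of each policy option under each scenario
    "invest":   {"growth": 10, "stagnation": 2, "recession": -5},
    "maintain": {"growth": 4,  "stagnation": 3, "recession": 1},
    "cut":      {"growth": -2, "stagnation": 1, "recession": 4},
}

def expected_utility(option):
    """Probability-weighted utility of an option across all scenarios."""
    return sum(p * utilities[option][s] for s, p in scenarios.items())

best = max(utilities, key=expected_utility)
print(best)  # invest
```

A real policy-making process would derive the probabilities and utilities from data and opinion mining or agent-based simulation; the sketch only shows the final aggregation and selection step.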
In 2001, the Gartner industry analyst Douglas Laney described data management challenges in e-commerce along the three dimensions volume, velocity and variety. Volume stands for the rapid growth in the amount of data, velocity for increased point-of-interaction speed and the pace at which data is generated by interactions and used to support them. Variety refers to incompatible data formats, non-aligned data structures and inconsistent data semantics. Since its publication in 2001, this 3-V model has been widely used in attempts to define big data.