Since big data solutions have the potential to generate previously unavailable insights and create valuable services, how could companies be motivated to share proprietary data with the public sector, while preserving security and privacy?
In 2001, the industry analyst Douglas Laney at Gartner described data management challenges in e-commerce along three dimensions: volume, velocity and variety. Volume stands for the rapid growth in the amount of data; velocity for increased point-of-interaction speed and the pace at which data is generated by interactions and used to support them; variety for incompatible data formats, non-aligned data structures and inconsistent data semantics. Since its publication in 2001, this 3-V model has been widely used in attempts to define big data.
The Oxford Dictionary defines the term Big Data as “extremely large data sets that may be analysed computationally to reveal patterns, trends, and associations, especially relating to human behaviour and interactions”.
Viktor Mayer-Schönberger and Kenneth Cukier point to what can be done with the data and why its size matters: it is “the ability of society to harness information in novel ways to produce useful insights or goods and services of significant value”. Nevertheless, they also highlight potential risks, e.g. in terms of privacy, predictions used to punish people even before they have acted, or the abuse of data by people with bad intentions.
Laney, D. (2001), 3D Data Management: Controlling Data Volume, Velocity, and Variety, https://blogs.gartner.com/doug-laney/files/2012/01/ad949-3D-Data-Management-Controlling-Data-Volume-Velocity-and-Variety.pdf, retrieved February 20, 2018.
Oxford Dictionary (2018), Big Data, https://en.oxforddictionaries.com/definition/big_data, retrieved February 20, 2018.
Mayer-Schönberger, V., Cukier, K. (2013), Big Data: A Revolution That Will Transform How We Live, Work, and Think, Houghton Mifflin Harcourt, New York.
| Policy Area | Agenda Setting | Policy Design and Analysis | Policy Implementation | Policy Monitoring and Evaluation |
|---|---|---|---|---|
| Agriculture, Fisheries, Forestry & Foods | | | | |
| Economy & Finance | | | | |
| Education, Youth, Culture & Sport | | | | |
| Employment & Social Security | | | | |
| Environment & Energy | | | | |
| Foreign Affairs and Defence | | | | |
| Innovation, Science & Technology | | | | |
| Urban Planning & Transport | | | | |
| Institutional Questions / Internal Affairs | | | | |
With regard to your question, we have identified the trend Data Philanthropy, which partly corresponds to the aspect you mentioned in your comment. But data privacy and security are certainly issues that need to be solved for such a public-private cooperation model.
Projects like the H2020 ESPRESSO project tackle this topic by developing a smart city architecture that considers digital marketplaces as well as the possibility to develop and provide data-related services and applications.
Big Data is indeed a crucial trend not only for the public, but also for the private sector. Keeping the focus on the public sector, do you think that public authorities really have the infrastructure and know-how to take advantage of Big Data? Or are intermediate steps needed?
You are right: an appropriate infrastructure is indeed the basis for generating big data.
Smart City approaches are for instance paving the way for integrated infrastructures. One of the most important guiding principles will be the orchestration of existing and emerging systems based on an open design paradigm.
Big Data is partly linked to the trend Predictive Analytics, which we also identified in our research. It helps to improve policy-making processes by providing evidence-based foundations for decisions. In our Knowledge Base we have also collected a number of use cases that show how public administrations incorporate big data into their policy-making processes.
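To make the idea of predictive analytics a bit more concrete: at its simplest, it means fitting a model to historical figures and extrapolating them to inform a decision. The sketch below is purely illustrative; the data, variable names, and the urban-planning scenario are invented assumptions, not taken from any of the projects or use cases mentioned above.

```python
# Minimal sketch of predictive analytics for policy monitoring:
# fit a linear trend to (hypothetical) yearly figures and forecast ahead.
# All data and names here are illustrative.

def fit_linear_trend(years, values):
    """Ordinary least-squares fit: values ~ slope * year + intercept."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values)) \
            / sum((x - mean_x) ** 2 for x in years)
    intercept = mean_y - slope * mean_x
    return slope, intercept

def forecast(years, values, target_year):
    """Extrapolate the fitted trend to a future year."""
    slope, intercept = fit_linear_trend(years, values)
    return slope * target_year + intercept

# Hypothetical yearly commuter counts (in thousands) for an
# urban-planning question: how much capacity is needed next year?
years = [2014, 2015, 2016, 2017]
counts = [100, 110, 120, 130]
print(round(forecast(years, counts, 2018)))  # -> 140
```

Real public-sector applications would of course use richer models and validated data, but the principle is the same: an explicit, reproducible forecast replaces gut feeling as the basis for a decision.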