European Proceedings

Big Data Analytics And Firm Innovativeness: The Moderating Effect Of Data-Driven Culture

Table 1: Definitions of big data, by source

Kaur and Sood (2017, p. 1): “A collection of huge volumes of diverse types of structured and unstructured data that cannot be handled by state-of-the-art data processing platforms”
Dumbill (2013, p. 1): “data that exceeds the processing capacity of conventional database systems. The data is too big, moves too fast, or doesn’t fit the structures of your database architectures. To gain value from this data, you must choose an alternative way to process it”
Beyer and Laney (2012): “High volume, velocity and variety information assets that demand cost-effective, innovative forms of information processing for enhanced insight and decision-making”
Havens et al. (2012, p. 1130): “data that you cannot load into your computer’s working memory”
Manyika et al. (2011, p. 1): “datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyse”
Microsoft (2013): “The process of applying serious computing power, the latest in machine learning and artificial intelligence, to seriously massive and often highly complex sets of information”
Boyd and Crawford (2012, p. 663): “cultural, technological, and scholarly phenomenon that rests on the interplay of technology, analysis and mythology”
Fisher et al. (2012, p. 53): “Data that cannot be handled and processed in a straightforward manner”
Intel (2012): “Complex, unstructured or large amounts of data”
Morabito (2015, p. viii): “dubbed to indicate the challenges associated with the emergence of data sets whose size and complexity require companies to adopt new tools and models for the management of information”
De Mauro et al. (2016, p. 131): “the Information asset characterised by such a High Volume, Velocity and Variety to require specific Technology and Analytical Methods for its transformation into Value”