Theoretical and methodological frameworks

ArtCatalog emerges from the convergence of intellectual frameworks, technological developments and epistemological requirements that together make it necessary to reconsider the logic governing the processes of knowledge production.
1. On the one hand, reference must be made to Foucault’s theses on institutional discourses, the discursive nature of knowledge, and its necessary interweaving within the textual, narrative and linguistic structures that make it possible and mediate its construction. These theses were widely developed in post-structuralist thinking and have been elaborated further up to the present day, in an unceasing exercise of updating and renewal, by the multidisciplinary studies conducted in the English-speaking world known as Art History Writings (Carrier, 1993; Elkins, 1997; Preziosi, 1998; Baxandall, 2003; Grant and Rubin, 2012; among others). These studies reflect upon how art history is constructed and conceived through its writings, text structures and narrative genres. All these investigations – which have scarcely been developed in Spain to date (Rodríguez Ortega, 2013d) – are therefore an essential reference for our research.
2. Another important group of studies comprises those of Bruno Latour (Actor-Network Theory, or ANT, later rechristened the actant-rhizome ontology), which represent one of the fundamental intellectual frameworks of our project. Actor-Network Theory is one of the most disruptive theories to appear in recent decades as regards the way we understand the processes of knowledge production, since it encompasses the social, the institutional and the material (objects). For Latour, knowledge is no longer a series of ideas that can be submitted to hermeneutic analysis; it is instead the result of interactions established among heterogeneous actors (Latour, 1983, 1992, 1998, 2005; Callon, Law and Rip, 1986; Callon, 1989). Actor-Network Theory has proved very useful for describing the complex relationships that come into being in technical-scientific knowledge networks (Echevarría and González, 2009). Yet it is also being used to rethink thought as a whole and is fostering a review of the theses on the ways in which society is constructed (Latour, 2005; Bogost, 2012; among others). In the specific field of Art History, although some sporadic studies were conducted in the early 21st century (Hennion, 2001; Latour and Weibel, 2002), the theory has only recently gained renewed interest, given its potential to model, analyse and understand cultural phenomena as a set of networks in which objects and subjects interact in a continuous process of redefinition, producing diverse meanings and senses depending on these interactions (Zell, 2011).
Actor-Network Theory dovetails with the data-processing capabilities that digital technology now affords, since it relies on analysis strategies that discover and model the actors involved and the networks set up among them. At this point we come to another important aspect of the proposed research project: its full interweaving with scientific and academic studies of a digital nature (digital scholarship), and more specifically with the field of Digital Art History.
3. Digital Scholarship and Digital Art History. Although the field of the Digital Humanities has a long history – as the previous research carried out by a large part of the proposed team attests – in recent years the field known in the English-speaking world as Digital Art History has experienced a spectacular boom. This boom has brought with it a progressive process of institutionalisation encompassing a whole series of initiatives: seminars, research studies, specialised centres and departments, research groups, university programmes, publications, etc. What we wish to stress here is that this rapid development has taken place as awareness has grown of the need to integrate art-historical studies into the new processes of analysis and interpretative models linked to the digital paradigm (Zorich, 2012; Drucker, 2013; Cuno, 2013; Kohle, 2013; Zurich Declaration, 2014; Fisher and Swartz, 2014; among many others).
4. Analysis strategies based on large datasets. Within the broad field of so-called digital scholarship, this project proposes specific analysis strategies based on the computational processing of large datasets. Such methods are proving feasible and of heuristic value in other projects (in which some members of this research team also participate), and are suitable for addressing the intellectual problems that we pose regarding the catalogue. What interests us about these strategies is not just their technological application, but the fact that they have a heuristic value in themselves: the capacity to raise new questions and to formulate new knowledge on the basis of the results obtained. Owing to the sheer volume of data involved, these questions and this knowledge lie beyond the capacity of the human brain and can only be tackled through algorithmic processing. Furthermore, these strategies entail important paradigm shifts, which also act as reference horizons for our project. In this regard, we identify two important groups of studies.
4.1. Studies on the theory of cultural networks and behaviour logics (DiMaggio, 2011; Sieck et al., 2010). The analysis of large datasets by means of complex algorithms makes it possible to extract and visualise certain patterns of behaviour; that is, it allows us to discover the logics that govern the functioning of the phenomena under analysis. A deeper understanding of these phenomena is thus obtained by accessing the basic forces that explain how they work – in our case, catalogues and the processes of knowledge production associated with them. The area of research that Lev Manovich (2009) has called Cultural Analytics is based on this approach and has become a fundamental methodological reference.
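As a minimal illustration of what "extracting patterns of behaviour" can mean computationally, the sketch below (all records and terms are invented for illustration, and the method is deliberately crude) counts which descriptive terms co-occur across a handful of hypothetical catalogue entries; at scale, the same logic – applied with far richer features and algorithms – is what allows recurring cataloguing practices to surface from the data.

```python
from collections import Counter
from itertools import combinations

# Hypothetical, simplified catalogue records: each entry is the set of
# descriptive terms applied to one artwork (all values invented).
records = [
    {"retablo", "anonymous", "gothic"},
    {"retablo", "attributed", "gothic"},
    {"portrait", "signed", "baroque"},
    {"retablo", "anonymous", "gothic"},
    {"portrait", "attributed", "baroque"},
]

# Count how often each pair of terms co-occurs across the whole corpus:
# frequently recurring pairs are a crude proxy for the descriptive
# "logics" that govern how entries are written.
pair_counts = Counter()
for terms in records:
    for pair in combinations(sorted(terms), 2):
        pair_counts[pair] += 1

for pair, n in pair_counts.most_common(3):
    print(pair, n)  # the strongest pairing is ("gothic", "retablo"), seen 3 times
```

A real analysis would of course replace the toy co-occurrence counts with clustering, topic modelling or similar techniques over thousands of records; the point is only that the "patterns" discussed above are, concretely, statistical regularities extracted from data.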
4.2. Studies that investigate the intellectual problems of scale and complexity – key concepts in the new epistemological paradigm of the digital society, likewise associated with the processing of large datasets. Computational data-processing systems, together with graphical visualisation and mapping models, now enable us to obtain a comprehensive, global picture of certain phenomena. This change of perspective – of scale – towards the global and exhaustive is what Franco Moretti (2005, 2013) has called the passage from close reading, hitherto the basis of the study of the humanities (analysis of a subset of elements as representatives of certain cultural phenomena; analysis of some detail of the whole), to distant reading, which implies a global analysis of all the components that make up the phenomenon under study (analysis of the whole). What matters about this approach is that it makes possible something hitherto unavailable: a multi-scale perspective. That is to say, at the same time as it affords an overview of how things are organised and work, it also allows us to address unique, local, individual and/or marginal facts – including those that have gone unnoticed by traditional historiographical studies because they are not part of the mainstream or of the phenomena considered relevant, but which data mining can now make visible by extracting significant data, so that they themselves become newly produced knowledge (Page, 2011). It is precisely the possibility of analysing the same phenomenon at different scales that allows us to deal with cultural complexity. Significant examples of this methodology are found in the analysis of large literary corpora (Jockers, 2013).
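The multi-scale movement described above – aggregating over a whole corpus while retaining the ability to zoom in on a marginal detail – can be sketched in a few lines. The fragment below uses an invented three-entry "corpus" purely for illustration: the global word counts stand in for distant reading, and locating a term that occurs only once stands in for recovering a marginal fact.

```python
from collections import Counter

# Toy corpus standing in for a large body of catalogue texts
# (all texts invented for illustration).
corpus = {
    "entry_a": "gothic retablo with gilded frame",
    "entry_b": "gothic retablo attributed to an anonymous workshop",
    "entry_c": "baroque portrait with ebony frame",
}

# "Distant reading": aggregate over the whole corpus at once.
global_counts = Counter(w for text in corpus.values() for w in text.split())

# Multi-scale perspective: the same data also lets us zoom in on rare,
# "marginal" terms and locate exactly where each one occurs.
rare = [w for w, n in global_counts.items() if n == 1]
where = {w: [k for k, t in corpus.items() if w in t.split()] for w in rare}

print(global_counts.most_common(2))  # the dominant vocabulary of the corpus
print(where["ebony"])                # the single entry containing "ebony"
```

At realistic scale the aggregation would run over millions of tokens and the drill-down would be backed by an index, but the two moves – global view and local recovery – are exactly these.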
The question of scale is currently one of the most active intellectual debates within the framework of what is known as Global Art History. In general terms, the question is how the significance of a cultural agent or phenomenon changes according to whether it is analysed from a micro (local) or macro (global) perspective, what becomes visible and what becomes overshadowed in each case, and how art history is constructed within this dialectical tension, which can now be revealed and analysed. The problem of scale, usually associated with the analysis of the geospatial and geopolitical distribution of large datasets, is also prompting a review of studies linked to the theoretical framework of cultural circulation and transfer, which are now analysed as problems of scale between the local, the national and the transnational.
A large number of studies that have appeared in recent years, evidencing the value of these analyses in generating new interpretations, will serve as prior references and methodological models for our project (Fletcher and Helmreich, 2013; Suárez, 2013; Caldas, Ortega et al., 2014; Joyeux-Prunel, 2009, 2010; Dossin, 2014; Lombardi, 2014; among others).
4.3. With regard to specific analysis strategies, the combination of technologies such as data mining, network analysis and graph analysis (the graphical and mathematical representation of a network of nodes) – drawn from computer science, the social sciences and statistics – has proved a powerful instrument for the study of network and sub-network systems, and has become popular for the study of various cultural and social phenomena (Scott and Carrington, 2011).
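To make the network-analysis strategy concrete, the sketch below builds a miniature actor-network (all nodes and edges invented; people, institutions and objects treated uniformly as nodes, in the spirit of ANT) and computes degree centrality, one of the simplest graph measures, using no external libraries. In practice one would use a dedicated graph library and far richer measures, but the principle is the same.

```python
# Invented ties recording interactions among heterogeneous actors.
edges = [
    ("curator", "catalogue"),
    ("curator", "museum"),
    ("painter", "catalogue"),
    ("museum", "catalogue"),
    ("collector", "painter"),
]

# Build an undirected adjacency list from the edge list.
adj = {}
for a, b in edges:
    adj.setdefault(a, set()).add(b)
    adj.setdefault(b, set()).add(a)

# Degree centrality: a node's degree divided by (n - 1). Nodes with
# high centrality are the hubs through which interactions flow.
n = len(adj)
centrality = {node: len(nbrs) / (n - 1) for node, nbrs in adj.items()}

print(max(centrality, key=centrality.get))  # → catalogue
```

In this toy network the catalogue itself emerges as the most connected node, which is precisely the kind of structural observation – here trivial, at scale non-obvious – that graph analysis is meant to deliver.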