Sociology of Power

Neural Network Nebulae: ‘Black Boxes’ of Technologies and Object-Lessons from the Opacities of Algorithms

https://doi.org/10.22394/2074-0492-2020-2-157-182

EDN: OJYPQZ

Abstract

The paper deals with the quandary of the neutrality and transparency of technologies. First, I show how this problem is connected with the image of opening ‘black boxes’, which is pivotal to much of science and technology studies. Second, the methodological and socio-political dimensions of the ‘black box’ metaphor are discussed. Third, I analyze three typical solutions to the problem of the neutrality of technologies, both outside and inside constructivist technology studies. It is demonstrated that, despite their apparent differences, these solutions share the same logic of conceptualizing technology as a neutral intermediary. Fourth, I look for an alternative to this logic in the actor-network theory of Bruno Latour. Here technologies are conceived as an eventful association of heterogeneous entities irreducible to its conditions of possibility. The construction of technologies is understood as mediation, or as a ‘making-do’ process in which creators are surprised by their creations and vice versa. In Latour’s actor-network theory, technologies are interpreted as opaque and non-neutral entities. Finally, I turn to some object-lessons from smart technologies powered by neural networks to demonstrate that they empirically vindicate Latour’s conception of technical mediation. Particular attention is paid to the opacity and (non)interpretability of machine learning algorithms.

About the Author

Andrei G. Kuznetsov
European University at Saint-Petersburg; ITMO University, Russian Federation

PhD in Sociology, Research Fellow, STS-Centre, European University at Saint-Petersburg; Associate Professor, ITMO University.



References

1. Ananny M., Crawford K. (2016) Seeing Without Knowing: Limitations of the Transparency Ideal and Its Application to Algorithmic Accountability. New Media & Society, 20 (3): 980–981.
2. Bijker W., Law J. (eds) (1992) Shaping Technology/Building Society: Studies in Sociotechnical Change, Cambridge, Mass.; London: MIT Press.
3. Bloor D. (1991 [1976]) Knowledge and Social Imagery, 2nd ed., Chicago; London: The University of Chicago Press.
4. Burrell J. (2016) How the Machine ‘Thinks’: Understanding Opacity in Machine Learning Algorithms. Big Data & Society, 3 (1): 1–12.
5. Diakopoulos N. (2013) Algorithmic Accountability Reporting: On the Investigation of Black Boxes. Report, Tow Center for Digital Journalism, Columbia University. (https://academiccommons.columbia.edu/doi/10.7916/D8ZK5TW2)
6. Dougherty C. (2015) Google photos mistakenly labels black people “gorillas”. The New York Times, 1 July. (http://bits.blogs.nytimes.com/2015/07/01/google-photos-mistakenly-labels-black-people-gorillas)
7. Edge D. (1979) Quantitative Measures of Communication in Science: A Critical Review. History of Science, 17 (2): 102–114.
8. Fuller S. (1997) Constructing the High Church-Low Church Distinction in STS Textbooks. Bulletin of Science, Technology & Society, 17 (4): 181–183.
9. Jasanoff S. (ed.) (2004) States of Knowledge: The Co-Production of Science and the Social Order, London; New York: Routledge.
10. Johnson J. (1988) Mixing Humans and Nonhumans Together: The Sociology of a Door-Closer. Social Problems, 35 (3): 298–310.
11. Latour B., Mauguin Ph., Teil G. (1992) A Note on Socio-Technical Graphs. Social Studies of Science, 22 (1): 33–57.
12. Latour B., Woolgar S. (1986) Laboratory Life: The Construction of Scientific Facts, Princeton, New Jersey: Princeton University Press.
13. Latour B. (1990) Technology Is Society Made Durable. The Sociological Review, 38 (1_suppl): 103–131.
14. Latour B. (1992) Where Are the Missing Masses? The Sociology of a Few Mundane Artifacts. W. Bijker, J. Law (eds) Shaping Technology/Building Society: Studies in Sociotechnical Change, Cambridge, Mass.; London: MIT Press: 225–259.
15. Latour B. (1994) On Technical Mediation: Philosophy, Sociology, Genealogy. Common Knowledge, 3 (2): 29–64.
16. Latour B. (1996) Aramis, or The Love of Technology, Cambridge, Mass.; London: Harvard University Press.
17. Latour B. (1999) Pandora’s Hope: Essays on the Reality of Science Studies, Cambridge, Mass.: Harvard University Press.
18. Latour B. (2004) Why Has Critique Run out of Steam? From Matters of Fact to Matters of Concern. Critical Inquiry, 30 (2): 225–248.
19. Lipson H., Kurman M. (2016) Driverless: Intelligent Cars and the Road Ahead, Cambridge, Mass.: MIT Press.
20. Madrigal A. (2014) How Netflix Reverse-Engineered Hollywood. The Atlantic, January. (https://www.theatlantic.com/technology/archive/2014/01/how-netflix-reverseengineered-hollywood/282679/)
21. Miller B. (2020) Is Technology Value-Neutral? Science, Technology, & Human Values, First Published Online January 22. (https://journals.sagepub.com/doi/10.1177/0162243919900965)
22. Neyland D. (2015) Bearing Account-Able Witness to the Ethical Algorithmic System. Science, Technology, & Human Values, 41 (1): 50–76.
23. Ogburn W.F. (1964) On Culture and Social Change: Selected Papers, Chicago: University of Chicago Press.
24. Pasquale F. (2015) The Black Box Society, Cambridge, Mass.; London: Harvard University Press.
25. Pinch T., Bijker W. (1984) The Social Construction of Facts and Artefacts: Or How the Sociology of Science and the Sociology of Technology Might Benefit Each Other. Social Studies of Science, 14 (3): 399–441.
26. Pitt J.C. (2014) “Guns Don’t Kill, People Kill”; Values in and/or around Technologies. P. Kroes, P.-P. Verbeek (eds) The Moral Status of Technical Artifacts, Dordrecht, the Netherlands: Springer: 89–101.
27. Spolsky J. (2000) Things You Should Never Do, Part I. (https://www.joelonsoftware.com/2000/04/06/things-you-should-never-do-part-i/)
28. Stilgoe J. (2017) Machine Learning, Social Learning and the Governance of Self-Driving Cars. Social Studies of Science, 48 (1): 25–56.
29. Vanderbilt T. (2012) Let the robot drive: The autonomous car of the future is here. Wired, January 20. (https://www.wired.com/2012/01/ff_autonomouscars/2/)
30. Wade L. (2010) HP software doesn’t see black people. Sociological Images, 5 January. (https://thesocietypages.org/socimages/2010/01/05/hp-software-doesnt-see-black-people/)
31. Winner L. (1988) The Whale and the Reactor: A Search for Limits in an Age of High Technology, Chicago: University of Chicago Press.
32. Woolgar S., Cooper G. (1999) Do Artefacts Have Ambivalence? Moses’ Bridges, Winner’s Bridges and Other Urban Legends in S&TS. Social Studies of Science, 29 (3): 433–449.
33. Weber A. (2012 [1920]) Fundamentals of Cultural Sociology: Social Process, Civilizational Process and Cultural Movement. C. Loader (ed.) Alfred Weber and the Crisis of Culture, 1890–1933, New York: Palgrave Macmillan: 165–205.
34. Wynne B. (1988) Unruly Technology: Practical Rules, Impractical Discourses and Public Understanding. Social Studies of Science, 18 (1): 147–167.


For citations:


Kuznetsov A.G. Neural Network Nebulae: ‘Black Boxes’ of Technologies and Object-Lessons from the Opacities of Algorithms. Sociology of Power. 2020;32(2):157-182. https://doi.org/10.22394/2074-0492-2020-2-157-182. EDN: OJYPQZ


This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2074-0492 (Print)
ISSN 2413-144X (Online)