Trends in information technology
Jaron Lanier is essentially a philosopher with hands-on experience of the information-technology world. He explores current trends in information technology and extrapolates from them to how the future will look: how large firms, which he calls "Siren Servers", exploit the knowledge their users freely provide while eroding the very jobs that knowledge came from. He asks how we might future-proof jobs, and suggests a two-way linking system: "if the system remembers where information originally came from, then the people who are the sources of information can be paid for it" (Lanier, 2014, p. 218).
Lanier gives the example of Japan, which faces a severe shortage of working-age people and a large elderly population. The Japanese propose building robots to take care of the elderly; robots are already able to handle delicate tasks, and Lanier suggests they will be in use in Japanese nursing homes by 2020. The programming of these robots depends on information gained from observing nurses at work, and, he argues, the nurses should be compensated for that input. More broadly, Lanier contends that the middle class is increasingly disenfranchised from the online economy. By convincing users to give away valuable information about themselves in exchange for free services, firms acquire large amounts of data at virtually no cost. "Information is people in disguise, and people ought to be paid for value they contribute that can be sent or stored on a digital network" (Lanier, 2014, p. 235).
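Lanier describes his two-way linking idea only in prose. The following Python sketch, in which every name (the ledger class, the datum identifier, the royalty rate) is invented purely for illustration, shows what a minimal version of the principle might look like: each datum remembers its human source, and revenue earned from it is routed back as a micropayment.

```python
from collections import defaultdict

# Hypothetical sketch of Lanier's "two-way linking": every datum keeps a
# link back to the person it came from, so value generated from it can be
# paid back. Lanier states the principle, not an API; this is illustrative.

class ProvenanceLedger:
    def __init__(self):
        self.source_of = {}              # datum id -> person who provided it
        self.balances = defaultdict(float)

    def record(self, datum_id, person):
        """Remember where a piece of information originally came from."""
        self.source_of[datum_id] = person

    def monetise(self, datum_id, revenue, royalty_rate=0.1):
        """When a datum earns revenue, credit its source with a royalty."""
        person = self.source_of[datum_id]
        payment = revenue * royalty_rate
        self.balances[person] += payment
        return payment

# A nurse's observed movements train a care robot; the nurse gets a cut.
ledger = ProvenanceLedger()
ledger.record("nursing-motion-capture-001", "nurse_a")
ledger.monetise("nursing-motion-capture-001", revenue=50.0)
print(ledger.balances["nurse_a"])  # 5.0
```

The royalty rate of 10% is an arbitrary placeholder; Lanier's point is the bookkeeping of provenance, not any particular price.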
Last month Google’s AlphaGo computer defeated Lee Sedol at the board game Go, a game of intuition rather than quick calculation (Bingemann, 2016). According to Beijia Ma, a robotics and AI expert, this victory has huge implications for the future of computing: the information age is on the verge of another major step in computational power. As Ma puts it, “computing power is not just about hardware anymore” (Bingemann, 2016).
Ma discusses how AI emulates the behaviour of the human brain. Machine learning is “essentially about simple pattern recognition. But deep learning is the next generation of that and works architecturally through artificial neural networking” (Bingemann, 2016).
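A toy example may make the "artificial neural networking" Ma refers to concrete. The sketch below, a minimal one I have added for illustration, trains a single artificial neuron (a perceptron) to recognise the logical AND pattern from examples; deep learning stacks many layers of such units, but the core idea of learning a pattern by nudging weights is the same.

```python
# A single artificial neuron learning the AND pattern from examples.
# Deep learning uses many layers of such units; this shows only the seed
# of the idea, in plain Python with no libraries.

def step(x):
    return 1 if x >= 0 else 0

# Training data: input pairs and the AND pattern to be recognised.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # one weight per input
b = 0.0          # bias
lr = 0.1         # learning rate

for _ in range(20):                      # a few passes over the data
    for (x1, x2), target in samples:
        pred = step(w[0] * x1 + w[1] * x2 + b)
        err = target - pred
        w[0] += lr * err * x1            # nudge weights toward the pattern
        w[1] += lr * err * x2
        b += lr * err

print([step(w[0] * x1 + w[1] * x2 + b) for (x1, x2), _ in samples])
# [0, 0, 0, 1]
```

Because AND is a simple, linearly separable pattern, the single neuron finds it quickly; the patterns AlphaGo learned required vastly deeper networks.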
Reilly says “there is no doubt we are nearing the limit for Moore’s Law on silicon-based chips” (Bingemann, 2016). He suggests chips will have to be made from other materials. Reilly also discusses the economic consequences and suggests looking at how to maximise the use of chips in specific applications: how to optimise chip design without simply adding more transistors (Bingemann, 2016).
He says that custom-built chips are already in use in other parts of the computer industry: Microsoft’s search engine Bing uses a specialised chip called a field-programmable gate array (FPGA), whose hardware circuits can be reprogrammed after manufacture (Bingemann, 2016). The main area of Reilly’s research is quantum computing, which harnesses the laws of quantum mechanics to process information. This technology would use quantum bits, or qubits, which can hold far more information than classical bits and can process a vast number of calculations simultaneously (Bingemann, 2016).
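The sense in which qubits "hold much more information" can be illustrated with a short classical simulation, added here as a sketch of my own: a register of n classical bits holds one of 2**n values at a time, while the state of n qubits is described by 2**n amplitudes at once. Simulating this on an ordinary computer costs exponential memory, which is precisely why real quantum hardware is interesting.

```python
import math

# Toy state-vector simulation. Applying a Hadamard gate to every qubit of
# an n-qubit register (initially |00...0>) yields an equal superposition
# of all 2**n basis states: one amplitude per state, all nonzero at once.

def hadamard_all(n):
    """Equal superposition over all 2**n basis states of n qubits."""
    amp = 1.0 / math.sqrt(2 ** n)       # same amplitude for every state
    return [amp] * (2 ** n)

state = hadamard_all(3)                 # 3 qubits -> 8 amplitudes
print(len(state))                       # 8
probs = [a * a for a in state]          # measurement probabilities
print(round(sum(probs), 10))            # 1.0 (probabilities sum to one)
```

Each added qubit doubles the number of amplitudes, so 50 qubits would already need about 10**15 values to simulate classically; that exponential growth is the "vast number of calculations simultaneously" that Reilly describes.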
Bingemann, M. (2016, April 9). A super power on the rise in the cloud. The Weekend Australian.
Lanier, J. (2014). Who owns the future? London: Penguin Books.
Tags: AlphaGo, Go, Jaron Lanier, qubits, quantum computing, quantum mechanics, Who Owns the Future