The Evolution of IT into Artificial Intelligence

Information technology is the cornerstone of modern society: "the information age just began in the 1990's, with human activity shifting from traditional industry to creating/managing/using computerized information" (Vahid et al., 2015). With the information age beginning in the 1990s and much of society now dependent on IT, we can draw a line to its intrinsic link with artificial intelligence. Viewed through the lens of the core components of IT, AI appears as a natural and fundamental progression of technological innovation. This post delves into the initial scope and history of IT, and how we have given AI the means to run, and to develop new methods for, what were initially the fundamental pillars of IT.

The pillars of IT, as I propose them, are as follows: hardware, software, networking, data management, and cybersecurity. These lay the groundwork for the development and implementation of artificial intelligence within the information technology framework. Hardware advancements provide the computational power such algorithms require. In the 1940s, when computers were the size of large rooms, there was no conceivable notion that these machines could self-compute with the available resources, though the idea of the artificial neuron already existed. On the software side, a variety of programming languages exist for training AI models, generating the instructions and algorithms through which AI replicates a human neural network of synapses for creating and disseminating information (Zocca et al., 2017). Networking provides the means by which data can be sent and received between these machines, allowing for near-instantaneous gathering, sorting, and transposing of information into usable language. Data management is the critical piece for creating repositories within the AI and IT framework: volumes of information from which these machines develop and learn, or from which they create datasets of newly generated information based on their learning. Finally, when it comes to AI and cybersecurity, AI is still housed within critical IT infrastructure and requires technical, software-driven means of protection, whether AI is used to generate safeguards or, as in the past, one relies on specific software to protect a device and minimize threats online. AI, like the IT framework, utilizes, characterizes, creates, and above all requires cybersecurity.
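The artificial neuron mentioned above can be sketched in a few lines of code. This is a minimal, illustrative perceptron with a step activation; the weights, bias, and the AND-gate example are my own choices for demonstration, not part of any particular historical model.

```python
# A minimal artificial neuron (perceptron): a weighted sum of inputs plus a
# bias, passed through a step activation. Values below are illustrative.

def neuron(inputs, weights, bias):
    """Return 1 if the weighted sum of inputs plus bias exceeds zero, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0

# Example: weights and bias chosen so the neuron behaves like a logical AND.
and_weights = [1.0, 1.0]
and_bias = -1.5
print(neuron([1, 1], and_weights, and_bias))  # prints 1
print(neuron([1, 0], and_weights, and_bias))  # prints 0
```

Modern networks stack many such units and replace the step function with smooth activations so the weights can be learned, but the core idea of a weighted sum feeding an activation is the same.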

The intersection of AI and IT requires data, a critical resource that continually enables the innovation, development, and continuity of that intersection. AI within the framework of IT extracts insights and data from these IT systems, which enables machine learning and development. This relationship is broken down in the data model below, with an excerpt from Orhan Alav:

"Within this structure, there are multiple components such as IT/network/internet/w3/IA, data, research data, big data, data preparation, evaluation, data processing, data modeling, artificial intelligence, IT/network/internet/w3/IA, value creation and sustainability, and their interactions." (Alav, 2024)

The relationship between AI and IT is a symbiotic one, as Alav discusses in detail with the data ecosystem. Within AI there are pillars that rely on the IT framework. Machine learning comprises algorithms that enable systems to learn from the data the IT framework provides. Natural language processing relies on data management within the IT framework as the training interface that allows AI to learn and understand human language. Computer vision derives from the hardware and software of the IT framework: the ability of algorithms to search, sort, and analyze the visual information available to them and to derive image data from it, enabling tools like AI image generators. Robotics is quite literally a pillar that utilizes all the pillars of AI and IT, building learning models that are attached to physical systems to interpret and conduct automated tasks in real time in the physical world, all while communicating within the digital ecosystem of IT.
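The machine-learning pillar described above, an algorithm adjusting itself from data the IT framework supplies, can be shown with a toy example. This is a sketch only: the data points and learning rate are invented, and the model is a single parameter fit by gradient descent.

```python
# Toy machine learning: fit the parameter w in the model y = w * x by
# gradient descent on mean squared error. Data points are invented and
# lie on the line y = 2x, so w should converge toward 2.0.

data = [(1, 2), (2, 4), (3, 6)]

w = 0.0    # model parameter, starts untrained
lr = 0.05  # learning rate

for _ in range(200):
    # gradient of mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad

print(round(w, 3))  # prints 2.0
```

Real machine-learning systems fit millions of parameters against far larger repositories, but the loop is conceptually the same: compare predictions to stored data, then nudge the parameters to reduce the error.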

There are many challenges and ethical considerations in the symbiotic relationship of IT and AI. With ethics involved, and the prospect of AI being allowed to automate and retrieve information within organizational or government-level IT infrastructure, trade-offs and principles must be implemented: "high-level principles contain a set of underlying themes and aspects, which typically includes accuracy/performance, robustness/safety, fairness, privacy, explainability/interpretability, transparency, and accountability" (Sanderson et al., 2024). A regulatory framework is a necessity; however, with the landscape still developing on this front, many of these policies amount to a "catch-up" method competing with accelerating technological advances. The future will require real acknowledgement of these trade-offs and principles within the AI and IT framework, as we already see AI redefining entire industries, as noted in my paper on AI's uses in cybersecurity. There must also be equitable access and ethical implementation for the sustainment and progress of these artificially intelligent machines.

To conclude, the nexus of information technology and artificial intelligence is one rife with transformative benefits for our time. IT provides the foundational infrastructure that allows AI to blossom, while AI enhances IT systems through automation and data-driven insights. Together they create a synergistic, forward-driving relationship that reshapes the digital landscape for industries, governments, and individuals alike. This integration holds vast potential to improve the digital society we live in today and to shape human progress. If we foster a balance between our drive for perfection and the ethical concerns of innovation and human/machine responsibility, we can strive to unlock the full potential of IT and AI and pave the way for a smarter, more sustainable, and more connected future.


References:
Vahid, F., Lysecky, S., Wheatland, N., & Siu, R. (2015). TEC 101: Fundamentals of Information Technology & Literacy (7th ed., Vol. 1). zyBooks. 

Zocca, V., Spacagna, G., Slater, D., & Roelants, P. (2017). Python deep learning: Next generation techniques to revolutionize computer vision, AI, speech and data analysis. Packt Publishing.

Alav, O. (2024). Will artificial intelligence replace knowledge centers? Assessment of the situation. Mimarlık Bilimleri ve Uygulamaları Dergisi (MBUD), 9(1), 182–194. https://doi.org/10.30785/mbud.1441134

Sanderson, C., Schleiger, E., Douglas, D., Kuhnert, P., & Lu, Q. (2024). Resolving ethics trade-offs in implementing responsible AI. 2024 IEEE Conference on Artificial Intelligence (CAI), 1208–1213. https://doi.org/10.1109/cai59869.2024.00215
