Capturing, processing, storing, and presenting information is faster and less expensive today than it was 50 years ago. Gordon Moore, a co-founder of Intel, predicted in 1965 that computing power would increase dramatically while its relative cost decreased, roughly every two years (Intel, n.d.). According to The Economist, this maxim has stood the test of time over the past 50 years; however, the traditional method of shrinking transistors to pack more of them onto a processor is reaching its fundamental limit (The Economist, 2016). This technical limit has led engineers to look beyond classical physics, which underpins conventional computing through clearly defined binary physical states (Burd, 2016, p. 24). To continue improving processing capabilities, engineers are turning to quantum physics, exploiting the behavior of matter at the subatomic level so that a qubit can be in multiple states at the same time (Burd, 2016, pp. 24-25). This nascent technology is still being prototyped and is far too expensive to reach the public market at this time.
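At its heart, Moore's observation is a simple doubling rule, and its arithmetic can be sketched in a few lines of Python. The 1971 baseline used below (the Intel 4004's roughly 2,300 transistors) is an illustrative assumption for the sketch, not a figure drawn from the sources cited above.

```python
# A minimal sketch of Moore's Law as a doubling rule: the transistor
# count doubles roughly once every two-year period.

def projected_transistors(start_count: int, start_year: int, year: int,
                          doubling_period: float = 2.0) -> int:
    """Project a transistor count, assuming one doubling per period."""
    doublings = (year - start_year) / doubling_period
    return int(start_count * 2 ** doublings)

# Illustrative baseline (assumption): Intel 4004, ~2,300 transistors, 1971.
print(projected_transistors(2_300, 1971, 1991))  # ten doublings: 2355200
```

Twenty years at this pace yields ten doublings, a roughly thousand-fold increase, which is why even small deviations from the two-year period compound into very different long-run projections.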
As new computing architectures are developed, engineers will need to pay particular attention to memory addressing. Today's Intel processors maintain backwards compatibility with the original 8086 microprocessor, which makes it difficult to process an increasing number of bits using faster methodologies (Burd, 2016, pp. 89-90). This was evident at the turn of the millennium, when larger classes of computers began using 64-bit addressing: even though Intel provided processors supporting both 32-bit and 64-bit addressing, the change in architecture caused software compatibility issues.
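The practical stakes of address width can be shown with a short Python sketch (an illustrative aside, not drawn from the sources above): an n-bit address can name at most 2**n distinct byte locations, which is why 32-bit software tops out at 4 GiB of addressable memory while 64-bit addressing raises that ceiling enormously.

```python
# A small sketch of why address width matters: an n-bit address can
# identify at most 2**n distinct byte locations in a flat address space.

def max_addressable_bytes(address_bits: int) -> int:
    """Number of bytes a flat n-bit address space can reach."""
    return 2 ** address_bits

print(max_addressable_bytes(32))  # 4294967296 bytes, i.e. 4 GiB
print(max_addressable_bytes(64))  # 18446744073709551616 bytes, i.e. 16 EiB
```

The jump from 32 to 64 bits does not merely double the reachable memory; it multiplies it by 2**32, which is why the transition forced changes in pointer sizes, data structures, and operating-system interfaces rather than being a drop-in upgrade.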
While new advances in processor technology are being developed, the average citizen has access to a wide array of consumer electronics that rely on the classical processor. These microcomputer devices include smartphones, tablets, e-readers, laptops, and desktop computers. They typically support tasks such as browsing the web, creating documents, editing spreadsheets, curating photos, running apps (or applications), and performing business functions with accounting software packages (Burd, 2016, p. 35). This class of computers sometimes blurs the definition of a workstation, which is commonly understood to be a more powerful desktop computer. Workstations are often used for applications that require additional primary memory (RAM) to run many programs simultaneously, advanced graphics capabilities for applications such as AutoCAD, or multiple CPUs for statisticians who need faster processing. It may be argued, in support of Moore's Law, that the capabilities of today's workstation resemble the specifications of the next generation of desktops.
Even though it may be easy for an average consumer to purchase off-the-shelf computing devices for casual personal use, implementing, testing, and deploying systems for the enterprise requires a deep understanding of the underlying technology. Understanding how these components interoperate is critical to a project's success. To manage computing resources effectively, one must stay abreast of future technology trends through unbiased sources, such as professional organizations funded by memberships rather than by specific vendors (Burd, 2016, pp. 8-9).
References
Intel. (n.d.). 50 years of Moore's law. Retrieved September 12, 2016, from http://www.intel.com/content/www/us/en/silicon-innovations/moores-law-technology.html#
Burd, S. D. (2016). Systems architecture (7th ed.). Boston, MA: Cengage Learning.
Double, double, toil and trouble. (2016, March 12). The Economist (US).