A list called the 8 Great Ideas in Computer Architecture outlines
core concepts that have made the semiconductor industry so productive over the
past 70 years. The first of these says, “Design for Moore’s Law.” While Gordon
Moore’s prediction held, transistor density doubled in less time than most
products took to design. A product conceptualized around current technology
would be obsolete before it was released, so engineers had to anticipate and
design for future technology.
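As a back-of-the-envelope illustration of what that means in practice (my own toy numbers, not anything from the book), the arithmetic looks like this:

```python
# Sketch of the "design for Moore's Law" arithmetic. The design-cycle and
# doubling-period numbers below are hypothetical, chosen for illustration.

def projected_density(current_density: float,
                      design_years: float,
                      doubling_years: float = 2.0) -> float:
    """Transistor density expected when the product ships,
    assuming density keeps doubling every `doubling_years`."""
    return current_density * 2 ** (design_years / doubling_years)

# A 3-year design cycle against a 2-year doubling period means targeting
# roughly 2.8x today's density at release, not 1x.
print(projected_density(1.0, design_years=3.0))  # ~2.83
```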
Moore halved his predicted rate in 1975, from doubling every year to doubling
every two years, and the pace of improvement in transistor density has
generally continued to slow since then. To compensate, hardware has been
specialized for many different tasks, notably GPUs and TPUs for machine
learning and other processors optimized for machine vision. The Machine
Learning for Systems project at Google Brain, which uses reinforcement
learning to optimize chip floor planning, is a striking example of how
engineering for projected needs and technology continues to be a core
principle of silicon design.
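For a sense of what a floor planner is actually optimizing, here is a toy sketch using classic simulated annealing over a made-up netlist; this is deliberately not the reinforcement-learning method the Google Brain team uses, just the kind of traditional baseline it improves on:

```python
import math
import random

# Hypothetical netlist for illustration: block -> blocks it connects to.
NETS = {"cpu": ["cache", "dma"], "cache": ["mem"], "dma": ["mem", "io"]}
BLOCKS = ["cpu", "cache", "mem", "dma", "io"]
GRID = 8  # place blocks on an 8x8 grid, one block per cell
CELLS = [(x, y) for x in range(GRID) for y in range(GRID)]

def wirelength(place):
    """Total Manhattan distance over every connection in the netlist."""
    total = 0
    for a, partners in NETS.items():
        for b in partners:
            (ax, ay), (bx, by) = place[a], place[b]
            total += abs(ax - bx) + abs(ay - by)
    return total

def anneal(steps=20000, t0=5.0):
    """Simulated annealing: move random blocks, slowly get pickier."""
    place = dict(zip(BLOCKS, random.sample(CELLS, len(BLOCKS))))
    cost = wirelength(place)
    for i in range(steps):
        t = t0 * (1 - i / steps) + 1e-3  # simple linear cooling schedule
        cand = dict(place)
        free = [c for c in CELLS if c not in place.values()]
        cand[random.choice(BLOCKS)] = random.choice(free)
        new = wirelength(cand)
        # Always accept improvements; accept uphill moves with a
        # probability that shrinks as the temperature drops.
        if new <= cost or random.random() < math.exp((cost - new) / t):
            place, cost = cand, new
    return place, cost

placement, cost = anneal()
print("final wirelength:", cost, placement)
```

As I understand the published work, the RL approach treats placement as a sequential game and learns from prior designs, whereas annealing starts cold on every run.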
Only a small percentage of the tech industry focuses on chip design, but given the
rapid advancement of computer hardware and software in every sector of the industry, especially
in areas like edge computing and artificial intelligence, engineers should still “design
for Moore’s Law”: build products that not only make use of the most current
technology but also anticipate the capabilities of future systems and are
positioned to fill future needs. Engineers today are not looking at a single
axis of growth, as Moore’s Law described, but at many different advancing
technologies in areas such as connectivity, data processing, sensors, electric
actuation, and batteries.

How do engineers live beyond the edge of current technology? Is this accomplished by specializing in one area, or by a broad understanding of all the technologies that combine in complex systems? What technologies today are following a growth curve like the one computer processors followed 50 years ago?
Good questions! Interesting point that rapid advancement is happening in many areas, not just transistor count. For many tasks, we are in the post-scarcity era of computing. My 10-year-old workstation with an SSD does everything I need plenty fast. Unless you are working with video or some specialized data-processing task, you can easily buy more storage than you’ll ever use in your lifetime. Cloud computing is almost free for small workloads (you can run loads of stuff on a $5/mo instance). A modern phone has more than enough storage for a lifetime’s worth of apps we might want to use.

So in many areas, it seems hardware and available computing resources have outstripped software’s ability to use them. OSS has greatly accelerated the development of software, but it seems we still have a long way to go. Machine learning seems to be a huge focus these days that could greatly improve the usefulness of software and more fully utilize hardware.
I’d be interested to find a corresponding law for software development, but software metrics are much messier than transistor counts, and software is typically not “manufactured.”
To “design for Moore’s Law” requires that you somewhat keep up with what is going on. A good bit of my job is to stay abreast of technology and help my customers (mostly smaller verticals) integrate new/advanced technology into their products at the right time. I’m more of a generalist: jack of all trades, master of none. Blogs (managed in Feedly), podcasts, subscriptions to dozens of Discourse forums, email newsletters, Slack, and Discord are all valuable sources of information. There is a lot of useful information on Twitter, but the noise level is so high that it tends to be a huge time sink. The challenge is keeping up without being overwhelmed by the noise.

That is one of my hopes for this community: to attract strong technologists from different areas who can share and summarize relevant information. The person who shares benefits from the clarity of thought that comes from writing a few paragraphs about something; everyone else benefits from the high-signal, low-noise information.
Modern systems are complex, and innovation occurs at the intersection of disciplines and perspectives. Most communities are specialized; this community is an attempt to collect a more diverse range of interests and skills.
Very good points. Computers have come a long way in the 20th century and the early part of the 21st. I think Moore’s Law has long served its purpose, as has the time-tested von Neumann architecture; together they have carved more and more areas into computing’s fold. If you look at computer applications, the field has broadened a lot. It is no longer as narrow as it was through the ’90s and 2000s, so you will see distributed expertise in specific areas that will have to work together to build complex solutions. What we humans are doing is creating a single distributed computer for the world. If you take that view, then the technologies start to fall into place.
So firstly, it’s important to decide which area you want to concentrate on; that should be your strong expertise. However, the successful engineer is the one who can then take that expertise, apply it to adjacent fields, and solve larger problems. Programming will reach the common man, meaning everyone will know (or have to know) how to program, and some of them will be more prolific and drive the technologies forward. Program complexity is increasing, and new programming styles that improve the quality of programs will be required; that should be a keen area to watch, since the best tools are what make the difference. It was not the gold diggers who got rich in the gold rush, but the ones who supplied the tools.
Pretty neat! Floor planning is a critical activity for highly pipelined/parallel designs. Years ago we designed a medical printer and had a hard time getting an FPGA to meet timing. The design had 5 parallel data pipelines flowing, so once we figured out how to utilize the floor-planning features of the FPGA software, it worked perfectly. Below is a screenshot of the design. The data pipelines were essentially long shift registers that mapped perfectly onto the geometry of the FPGA.
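For anyone who hasn’t worked with FPGAs, here is a rough behavioral model of the shift-register idea, in Python rather than HDL (the 5 lanes match the design above; the stage depth is made up):

```python
from collections import deque

STAGES = 16  # pipeline depth per lane (hypothetical; flip-flops in hardware)
LANES = 5    # parallel data pipelines, as in the printer design

# One fixed-length deque per lane models a chain of registers.
pipes = [deque([0] * STAGES, maxlen=STAGES) for _ in range(LANES)]

def clock(inputs):
    """One clock edge: every stage hands its value to the next stage."""
    outputs = []
    for pipe, sample in zip(pipes, inputs):
        outputs.append(pipe[0])  # value leaving the last stage
        pipe.append(sample)      # new sample enters the first stage
    return outputs

# Drive all lanes; each input re-emerges STAGES clocks later.
for cycle in range(STAGES + 3):
    out = clock([cycle * 10 + lane for lane in range(LANES)])
print(out)  # samples fed in at cycle 2: [20, 21, 22, 23, 24]
```

Because each stage only talks to its neighbor, the hardware version places cleanly as a long ribbon of logic, which is why constraining the floor plan fixed the timing.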