8 Comments

Bricks Mover:
Every word is gold!!!

Mike:
Great job Eric! Thank you.

Eric Flaningam:
Thanks Mike!

Leslie Oosten:
Awesome...

Eric Flaningam:
Thanks Leslie!

Devansh:
This will be a great alternative format. I think having both is great, for different reasons.

Eric Flaningam:
Appreciate the support, my friend.

James Emanuel:
Today, silicon chips are facing some big challenges. One of the main issues is miniaturization. For years, we've been making these chips smaller and smaller, following Moore's Law. However, we're now getting to a point where it's really hard to make them any smaller. With leading-edge process nodes already down to 5 nanometers, we're running into problems like quantum effects that interfere with how the transistors work. It's becoming tough to keep shrinking them without losing performance.

This is why chips are now being stacked in layers, but that isn't ideal either. The big problem is heat and power consumption. As we pack more transistors into a smaller space, they generate a lot of heat and draw a lot of power. This slows the chip down and even shortens its lifespan. To deal with this, new technologies like silicon carbide and system-on-a-chip designs are being developed to handle heat better and use less power, but they aren't long-term solutions, particularly given the power required by the new wave of AI applications.
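
(A rough back-of-envelope way to see the heat problem, assuming standard CMOS behaviour and purely illustrative numbers: dynamic power scales roughly as

P_dynamic ≈ α · C · V² · f

where α is switching activity, C is the switched capacitance, V is the supply voltage and f is the clock frequency. While Dennard scaling held, V dropped with every new node, so adding transistors didn't add proportionally more watts. With supply voltages now stuck at roughly 0.7–1 V, packing more transistors into the same area and keeping them busy translates almost directly into more watts per square millimetre that somehow have to be removed.)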

Silicon is reaching the end of the road. It's the elephant in the room. No one wants to talk about it because companies like TSMC, Intel, ASML and others have invested so much capital in silicon chip manufacturing that changing course would be a quantum leap. Yet it is a leap we need to make.

Researchers are looking into other materials and technologies, such as optical computing, quantum computing and chips built on 2D materials, all of which offer possible solutions.

The big question then becomes: where does that leave companies such as NVIDIA, with their silicon-based GPUs?

I sincerely don't know the answer to this question, but it has been whirring around in my mind for a while. NVIDIA has surfed a huge wave, but when that wave dies, who is best placed to catch the next wave? Is it still NVIDIA? Or is this market ripe for disruption?

History tells us that RIM surfed a wave with its BlackBerry until Apple launched the iPhone. Blockbuster surfed a huge wave with movie rentals until Netflix arrived. Is the semiconductor industry the next to undergo this kind of change?

I have no idea, and I confess I'm not an expert in this field, so I welcome the views of others. Please educate me.
