There are a lot of things you can say about Silicon Valley, but one of the more interesting aspects for me has been how "small" it is, in the sense that a handful of people have an outsized impact. I have never been one of those people, of course, just part of the entourage. But it has been interesting to watch and learn from folks who are good (and bad) role models.
A few years back when the teacher retired, we all gathered together to thank him, including Pat! World-class guy, and he makes us proud! So great to be able to tell students who come from nothing the story of how you can leave our hometown and eventually become the CEO of Intel.
Wishing Pat the best of luck in his new role!
Nations like Finland and Poland have figured out how to make that change and seen their students flourish; hopefully we will too.
Doesn't that mean that a lot of people will never be on the receiving end just because they did not win the teacher-lottery (arguably the chance of getting such a teacher is rather slim)?
Shouldn't our systems be more robust? (Not talking about normalizing these effects down, but creating a better environment for everyone, using these effects en masse)
We need our very best to go into teaching. It's the ultimate multiplier role.
But also I think about how much crap they put up with in those roles, and how relatively low-stress my life is, and then I feel better about it. They're cut out for that kind of job, and I most certainly am not.
Intel's technical failure with 10nm has gone hand-in-hand with financial success with 14nm. That is, without 10nm chips on the market in a meaningful way they've been able to raise prices for 14nm parts -- Intel put up better financial numbers than ever in a time when it has not been investing in future success.
VMware was a big thing in 1998, but it was obsolete by the time Pat got involved: a hypervisor is naturally part of the kernel, and there is no way cloud providers are going to spend their margin on VMware. Yes, many people in business are terrified of open source software and want a proprietary product so they have somebody to sue (VMware), or they need somebody to hold their hand (Pivotal). Either way, VMware and Pivotal are units that can be merged and spun off whenever a company based in Texas (Dell) or Massachusetts (EMC) wants to look like it has a presence in San Francisco. You see the VMware logo on CNBC every morning, and somebody thinks the king is on the throne and a pound is worth a pound, but that doesn't mean anything in the trenches.
Like Intel in the past 10 years, VMware is entirely based on a harvesting business model. In the short term Intel made profits by pandering to cloud providers; but in the long term cloud providers invested their profits in better chips. (What if Southwest Airlines had developed a 737 replacement designed from the ground up for a low cost airline?)
Pat might be able to keep the game of soaking enterprise customers going for longer, but someday the enterprise customers will be running ARM and the clients will be running ARM and the coders will be thinking "did they add all of those AMX registers just to put dead space on the die to make it easier to cool?" and falling in love again with AVR8.
- VMware grew almost 2.5x under Pat. Obviously their core franchise is about harvesting, but new adjacent products like NSX and VSAN helped a lot with growth. They just took a long time to get traction given the customer base, who don’t like change. VMware Cloud on AWS has been surprisingly strong even to skeptics like myself.
On the other hand, companies would get rid of VMware if they could, and they tried and failed with OpenStack. The hypervisor is commodity, but the overall compute/network/storage private cloud system isn’t trivial. Turns out “holding hands” whether by software or services is pretty valuable?
Public cloud of course is a substitute, though private clouds and data centres have survived and thrived as well (for now).
- Pivotal had a boutique software and services model that would be difficult to scale as a public company given the current shifts in the enterprise fashion away from productivity-at-any-cost (PaaS and serverless) and towards perceived low cost building blocks (Kubernetes and its ecosystem).
But it would be a gross oversimplification to suggest this was merely a vanity project for EMC and Dell. It took in $500m+ annually on open source software (and another $300m in services), which is no small feat, though still minuscule given what surrounds it. But anything new has to start somewhere. Not enough to impact the mothership’s balance sheets, but there was something special there that could be nurtured. Whether it can be, or whether the differences matter enough, is anyone’s guess.
Pat could take Intel in a direction we don’t anticipate. VMware was written off for dead when Pat came on but he managed to buy it another 10-15 years and at peak a doubling of the stock price. He’s probably learned from that.
Hypervisor is a commodity, however management and support of hundreds or thousands of them is not. You can either pay people to support them and fix the software when it breaks or you can pay <vendor name here>. Given the former requires expertise and planning it's often more cost effective to go the latter.
Disclaimer: I'm employed by VMware (less than 1 year) and chose to come here based on pivots I feel they are making.
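A back-of-the-envelope sketch of that pay-people vs pay-the-vendor trade-off. Every number here is an illustrative assumption, not real salary or licensing data; the only structural claim is that in-house support has a staffing floor (you can't run a virtualization practice with half an engineer):

```python
# Hypothetical break-even sketch: in-house hypervisor support vs vendor licensing.
# All figures are made-up assumptions for illustration only.

def in_house_cost(num_hosts, engineers_per_100_hosts=2, engineer_cost=180_000):
    """Annual cost of staffing your own virtualization team.
    A minimum team of 3 models the expertise/on-call floor."""
    engineers = max(3, round(num_hosts / 100 * engineers_per_100_hosts))
    return engineers * engineer_cost

def vendor_cost(num_hosts, license_per_host=2_000):
    """Annual per-host vendor licensing and support."""
    return num_hosts * license_per_host

for hosts in (50, 200, 1000):
    print(hosts, in_house_cost(hosts), vendor_cost(hosts))
```

With these (assumed) numbers the vendor wins at every fleet size, which is the parent's point: the expertise floor makes "pay <vendor name here>" the cheaper option more often than engineers expect.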
The "actual" problem could be anything from the customer misinterpreting the datasheet (Marketing/Comms problem), to insufficient testing (Factory/Production problem), to chip function (Design Engineering problem). As a "new college grad", or NCG in the lexicon, I admired how he dug out details from various folks to get to the real problem. He always had ideas for things Systems Validation (SV) could do to maybe trigger the problem that made sense to me. He really embodied the philosophy of fixing the problem not fixing blame on some group.
My Pat Gelsinger story: I worked in a successor org to MIPO, in those days called MPG. One day during pre-silicon validation on the Pentium II, one of my NCG’s came to me and said: “An old test from the historical test archive is failing in the simulator. I want to find the author and ask him about it. Do you know a Pat Gelsinger?”
Me: “Well, he is an Intel Fellow now, so he might not remember what that test was supposed to do. “
I believe this is what you feel when you come to the industry when it is just starting in a given locality, and then it explodes. Senior workers who join in the earliest days tend to drift from company to company, but they rarely leave the place entirely, or change industries.
This effect can drive people to over-pay to go to "good schools" or work for less than they think they should be paid at the "good places to work" because it lets them join a cohort with members who are statistically more likely to be "successful" (for some arbitrary definition of success). I personally never paid much attention to it but thought about it when talking with this person.
I know from experience that it happens in Silicon Valley that someone will say, "Oh I know someone at some previous company who is really good at that, let me see if they would be willing to change jobs." That is why companies pay people bonuses for "referrals."
I suspect it happens in LA in the entertainment industry as well. If you watch the old American television series "ER", produced by Steven Spielberg's Amblin Entertainment, it is amazing to see people who appeared on that show first as guest actors and then showed up later as series regulars.
"The only one who may have a slim chance to completely transform Intel is Pat Gelsinger; if Andy Grove saved Intel last time, it will be his apprentice who saves Intel again. Unfortunately, given what Intel did to Pat during his last tenure, I am not sure he is willing to pick up the job, especially since the board's Chairman is Bryant; not sure how well they would go together. But we know Pat still loves Intel, and I know a lot of us miss Pat." - June 2018
"This is the same as Intel pushing out Pat Gelsinger. The product people get pushed out by sales and marketing. Which are increasingly running the show at Apple."  30 Days ago.
And numerous other references since 2009, with many more around various other forums and Twitter. I am getting quite emotional right now. I can't believe this is really happening. (I am writing this with tears in my eyes!) I guess Andy Bryant retiring makes the decision a little easier. And Pat has always loved Intel. I guess he is pissed those muppets drove it into the ground.
This is 12 years! 12 years to prove a point! Consider 4-5 years of lead-time on work since he left in 2009. That is 2014. Guess what happened after 2014?
Maybe it is too little, too late? Or maybe this will be another Andy Grove "Only the paranoid survive" moment?
The King is back at Intel. Despite being a fan of Dr Lisa Su, I am a little worried about AMD.
I wish him luck, though.
As Adam Smith said, "there is a great deal of ruin in a nation", and likewise, big companies generally get more opportunities to reinvent themselves than one would expect.
Tired brain read this as "there's a great deal in a ruin of a nation." Close enough, I guess?
Competitors may have passed them in the last couple of years, but Intel is still in a really solid position to turn things around.
Apple's situation was a lot more dramatic. Intel still has a very good market share and can probably take a couple+ years to safely get back on track, in my opinion.
Amongst other things:
- SGX is really far ahead of AMD SEV. The latter is probably easier to sell because it's marketed as "drop in" (encrypts whole VMs), but SEV has been repeatedly hacked or shipped with obvious gaping design holes that they patch later and call features. SGX is a lot more focused, a lot more flexible and frankly a lot more secure.
- Optane NVRAM is completely unique, as far as I know. It's only 10x slower than DRAM which is nothing, but it's persistent! It totally changes the whole IO hierarchy.
- AVX512 / DLBoost are able to hold their own against mid-range GPUs for some AI tasks, which is impressive and useful.
- Their core chips are still very fast. AMD chips are selling more for less, which is a good position for them to be in, and TSMC's process advantage is helping them out for now. But they don't have a truly massive edge in tech like Apple's competitors had when Jobs returned.
- Intel have a long tail of obscure features that AMD doesn't, although it's often hard to know this. SGX and AVX512 are high profile but there are others that are less well known.
I don't count side channels as an issue because it's also an issue for all their competitors, and frankly I found the near single-minded focus on Intel by the security community to be rather misleading. I even read a side channels paper that admitted they suspected AMD had the same problem but they didn't bother to check simply because they didn't have access to any AMD hardware in the first place, which was unusually honest.
Apple's M1 is very impressive in its space, but for high performance cases like servers it's only quad-core - you can't use the little low power cores for much. Apple have shown no interest in making server parts for a long time, and Apple's engineering is bespoke so it tells us nothing about what other ARM vendors can do. So in that space it's still just AMD vs Intel and Intel is a long way from being on its back yet.
I feel like the view of Intel as a "sinking ship" is an inside-baseball misread of the situation.
If you buy a PC or a laptop or a server today, you're most likely getting an Intel CPU. It's now at least possible to buy AMD in many market segments from major vendors - and of course many of us do - but to the broader consumer and hosting world Intel still dominates.
You can look ahead and extrapolate and say "they can't compete with TSMC right now, and maybe they're going to start falling behind further." Fair analysis. But they dominate the market to an extent that's hard to overstate.
Contrast this with between-Jobs-stints Apple, which was a tiny shrinking company with a niche product and no clear strategy.
Intel is almost in too-big-to-fail territory. They aren't sinking just because they now have real competition. Yes, they need to do "something" to maintain dominance, but why would you see their situation and conclude they won't? The new CEO here is a sign that they see the trouble ahead and are ready to steer around it.
If you have $233 billion (a LOT of money) and you want to own a world-class chipmaker, would you rather start from scratch or just buy Intel for cash?
Then there are companies that have at least managed to stabilize themselves reasonably, e.g. IBM and HP (I wouldn't consider either of them a huge turnaround success, but I would not expect them to head for bankruptcy anytime soon either).
And even companies that did go under, e.g. Kodak, took an enormous amount of time and effort to do so, and not without launching a cryptocurrency first…
Intel still has some amazing technology compared to AMD, and though they're screwed on fab process right now, I think they have a chance to succeed. But given the timeline on new CPUs and fab processes, it'll be 4 years before we see the fruits of anything he does.
If they actually want to survive long term, there are two paths as I see it:
A) Be legitimately better than AMD. This could include opening up the management engine, much higher performance chips at lower price points, or some sort of space magic utilizing their Altera acquisition.
B) Embrace RISC-V and push it to laptops and desktops HARD, while not pulling the Microsoft Embrace Extend Extinguish™ play. If they go this route then their stock becomes an exceptionally strong buy IMO.
There's a lot of interest in that from a national security perspective anyway.
Part of TSMC's offer is that they won't do design, they won't compete with you on design etc.
They specialize in the best possible fab and service for your design and that's it.
I want to be able to train NNs with a laptop that can normally last 20 hours.
I realize that WinTel machines aren't going anywhere because of entrenched business use -- but between NVIDIA GPUs for heavy workloads and M1s for day-to-day activities -- what is the feeling inside Intel right now? Is this like a Microsoft-Netscape moment in the 90s?
What it should definitely be, however, is a death-sentence for x86. I bet Intel are proud of it, but ultimately they must be at least somewhat jealous of those who don't have to use it.
We don't really know why Keller left. All we can say is that from the outside it looks like he might have just given up in disgust. Having an engineer at the helm again can make a big difference to an engineering organisation.
It'll be interesting to see what happens, I had written them off as on the path of inevitable decline and irrelevance.
If they have someone as CEO who understand the existential threat they're facing from everywhere maybe they'll survive.
The irony of the other top comment is that the AMD threat wasn't the competitive threat that mattered. AMD is also screwed.
We see it now with Google being the "evil empire". They haven't had a real hit since Android and seem to be floundering, but online ad revenue is such a huge geyser of cash that they're gonna be fine for a very, VERY long time.
I’d argue IBM largely is irrelevant today, but they technically still exist.
Google is lucky they have an ad monopoly because they don’t really have a coherent company vision, I wouldn’t buy their stock. They might get lucky with their deepmind purchase.
I don’t disagree with you, but if I was choosing companies in a strong position today it’d be Apple, Amazon, and Microsoft. I wouldn’t make a long term bet on Google or Intel.
The point is Intel still holds a huge amount of market share and has an enormous amount of cash. The new CEO seems to be universally praised in these comments and look at what Lisa Su was able to do at AMD.
x86 is on the way out and can't compete on power or performance. In the end RISC won, it just took a while to get there.
As things trend that direction AMD and Intel don't really have much to offer, they're competing on legacy tech.
With TSMC providing fabs for designs from anyone, the serious players don't need AMD or Intel. Apple has a massive lead here, but others watching this will follow it.
What exactly do you think is the fundamental limitation of x86? Most chips do lot and lots of crazy logic to go from instruction set to microcode, it's hard to imagine that the variable nature of x86 instructions are the limiting factor.
The "x86 tax" (if it exists, I guess) is usually estimated at somewhere on the order of 5% to 10% - which is a lot of money at scale but probably not enough for a total rethink.
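Plugging the quoted 5-10% figure into a hypothetical fleet shows why it is "a lot of money at scale" but probably not enough to force a total rethink. Fleet size and per-server cost below are made-up assumptions:

```python
# Rough arithmetic on the "x86 tax" estimate (5-10% decode overhead).
# Fleet size and annual cost per server are illustrative assumptions.
servers = 100_000
annual_cost_per_server = 3_000   # assumed power + amortized hardware, USD

for tax in (0.05, 0.10):
    waste = servers * annual_cost_per_server * tax
    print(f"{tax:.0%} tax -> ${waste:,.0f}/year")
```

Tens of millions a year for a large fleet: real money, but small next to the cost of migrating an entire software ecosystem off x86.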
There’s also the coordination problem with windows and third parties. Without vertical integration and tight OS support they’ll fall behind.
Something will have to give, but the future doesn’t look good.
Cost a bunch of engineering time and forward motion, internal politicking. Eventually it got binned after months of not getting what we wanted out of it. There was no technical reason for it.
Maybe hardware really is his thing, but that quid pro quo hurt productivity.
My purely personal view is that VMware's second act has begun and it'll do well. Pat deserves some of the credit for accepting that Kubernetes would be the future of the business and throwing his weight behind it.
There are aspects of Pat Gelsinger's leadership that I dislike, but they're orthogonal to his management style and foresight. He's been effective.
I’d be incredibly happy to have him if I were an Intel employee.
>A few years ago, back in 2016, Intel did a “RIF” (reduction in force) of about 11%. Intel had previously done a significant reduction way back in 2006 of about 10%
>In an industry that runs on “tribal knowledge” and “copy exact” and experience of how to run a very, very complex multi billion dollar fab, much of the most experienced, best talent walked out the doors at Intel’s behest, with years of knowledge in their collective heads
bottom line: Intel dug the hole itself and jumped into the deep end.
Many of the competent people I knew at Intel have left (not all), while many of the incompetent people I knew are still there.
The intuition is that older employees cost more and by cutting them you can reduce your payroll more significantly while doing what looks like smaller employee cuts from the outside. This is often viewed favourably by investors because on paper it doesn't seem as the company is stalling (head count is still high, costs are down). The obvious issue is that these older employees are not easily replaceable and you end up losing more velocity in the long run than originally anticipated.
The above is more applicable to traditional blue-chip businesses where workforce movements are more limited. For software engineering (which Intel is not really) your assumption is correct and once cuts are announced a lot of your great engineers will jump ship.
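The payroll arithmetic above can be sketched like this. Headcounts and salaries are purely illustrative assumptions; the point is that the same "10% RIF" headline hides very different savings depending on who walks out:

```python
# Illustrative payroll math behind cutting senior staff. All numbers are
# assumptions, not Intel figures.
seniors, senior_salary = 100, 200_000
juniors, junior_salary = 100, 90_000

heads = seniors + juniors                                   # 200
payroll = seniors * senior_salary + juniors * junior_salary # 29,000,000

cut = heads // 10                         # a "10% RIF" = 20 heads either way
savings_if_senior = cut * senior_salary   # cut seniors: ~13.8% of payroll
savings_if_junior = cut * junior_salary   # cut juniors:  ~6.2% of payroll

print(savings_if_senior, savings_if_junior)
```

Same headline headcount reduction, more than twice the cost savings if the cuts land on senior staff, which is exactly why the "copy exact" knowledge walks out the door first.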
The above applies to everyone. When I was an intern, the company folklore was full of horror stories because the last guy who knew anything about a very profitable product died suddenly. (The product was for mainframes: clearly near end of life, but it was still mission critical for major customers and still had to get minor updates.)
I've also seen important people leave for a better job. Even when an offer of more money gets them to stay, my experience is they always wonder if they made the right decision, and so are never again as good as they were before.
Moral of the story: don't allow anyone in your company to become irreplaceable. This is good for you too: it means you won't stagnate doing the same thing over and over.
Even Swan is probably not an idiot; he simply expected everyone to struggle as much as Intel did on the 7/10nm node, and when TSMC just breezed past Intel and AMD came out with a much better product than anticipated, he found himself in very hot water.
(He could also be quite the idiot, I don't know him)
I don't think tech companies can afford this practice. So much knowledge resides in their senior talent, and the hard-won experience-based understanding and things gleaned through opportunistic exposure that they seem to voluntarily surrender.
He might have what it takes to turn Intel around.
The heavy reliance on the compiler for ILP was an “odd-choice” but not something that was unsound in principle.
If the ecosystem was more open from the get go and more vendors were involved it had a much better chance of taking off.
And if nothing else at least it was something new.
The biggest disappointment I have with Itanium is that it, and later Larrabee/Xeon Phi, kinda pushed Intel even further into their own little x86 box when it came to processing units.
I think that failure is also why they haven’t really done anything interesting with Altera.
It would be interesting to see an explicitly JIT-based approach to ILP.
Is the board leaning into the usual MBA-101 moves, turning Intel into a "services company" that gradually goes fabless and milks those sweet patents? Or will they put the work boots on and start building an actual tech company, with the people who can actually save them on the payroll, cutting back on the usual contractor meat grinder and inviting the vast armies of middle-management and marketing drones to leave?
Probably not, considering the guy they're throwing out the back door is a finance dude with an MBA and Gelsinger was/is an actual engineer.
I think there's an issue with helping Intel temporarily, because if you're TSMC you'd rather use your capacity to serve long-term partners rather than helping Intel bridge the gap to 7nm only to get dropped a couple of years from now when they get their chips in order.
It's like "flex". I don't hate them (I hate "relatable" and "addicting" but those are apparently acceptable as real words now) but it's odd how they seem to bubble up suddenly out of nowhere.
(British by the way - that might have a bearing)
The metaphor works for me ¯\_(ツ)_/¯
Ah. It makes a lot more sense now. I never made that connection.
I'm playing it by ear, and there's a high chance the "leaning into" expression was used incorrectly. I meant to say the board was more inclined to follow a roadmap than other options.
In my opinion, the secret sauce that makes Intel dominate certain industries is software. And it has been for some years already.
If you need really fast mathematical number crunching, e.g. high frequency trading or realtime audio filtering, then you need MKL, the Intel Math Kernel Library.
If you want to further reduce latency with parallelism, you need TBB, the Intel Threading Building Blocks.
Raytracing? Intel Embree.
Once you are locked in that deeply, the raw Intel vs AMD performance becomes meaningless. You only care about how fast the Intel libraries run.
So a CEO with experience building high performance low level software seems like an amazing fit.
Edit: And I almost forgot, the Intel compiler used in pretty much every PC game to speed up physics. Plus some people have seen success replacing GPUs with icc+avx for huge deployment cost savings in AI.
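One way that lock-in happens in practice: applications ship a vendor-optimized fast path with a portable fallback, so the vendor path is the only one anyone profiles or benchmarks. A minimal sketch, where the vendor module name is hypothetical:

```python
# Sketch of vendor lock-in via a fast-path/fallback split.
# `intel_mkl_bindings` is a hypothetical module name, used only to
# illustrate the pattern; it is not a real package.

def dot(a, b):
    try:
        import intel_mkl_bindings as fast   # hypothetical vendor fast path
        return fast.dot(a, b)
    except ImportError:
        # Portable pure-Python fallback, correct but slow at scale.
        return sum(x * y for x, y in zip(a, b))

print(dot([1, 2, 3], [4, 5, 6]))
```

Once every hot loop routes through the vendor branch, "Intel vs AMD" stops being a CPU comparison and becomes "Intel library vs untuned fallback", which is the lock-in the parent describes.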
Roughly 40% of their revenue is consumer chips where, apart from some games optimisation, they are no longer standing out from the crowd, and the leader is arguably Apple, with AMD doing well. The next ~30% of their business is servers, where there may be a significant number of HPC clients, but the bulk of this is again likely to be VMs running non-Intel specific software, and this market is starting to realise that Intel is nothing special here.
Looking at their revenue breakdown, I struggle to put more than 20% into the things that you mention they are great at. Should they focus on this? It would lose them much of their market cap if they did.
You lost me at Apple. Apple owns around 15% of the PC market space and almost the entirety of that is Intel-based systems. Outside of HN, nobody cares about the M1 chip, it isn't a selling point to my mom or her friends. If someone at the Apple store recommends it they might buy it instead of an intel-based system but it definitely isn't something they're seeking out.
The only threat Intel has right now in the consumer space is AMD, and it's a very real threat. AMD won both Sony and Microsoft console designs, and the mobile Ryzen 5000 chips released at CES look to have enough OEM design wins to put a serious hurt on Intel in 2021.
Even if Apple goes 100% M1, there's the other 85% of the market that Intel is likely far more concerned about.
I can absolutely see Qualcomm offering laptop chips off the back of the M1's success. They may not be as good, but they might be much cheaper. I can also see Microsoft pushing Windows on ARM harder, and rolling out their own chips at some point.
Also once the market gets "used to" multi-architecture software (again), I think we'll see a renaissance of chip design as many more players crop up, because of the lower barrier to entry.
They can't even remove decades old legacy code from their own products, so good luck everyone else.
If AWS and Apple can do it, soon other very large companies will, but in a few years, even OEMs will be able to develop their own chips. The market for high end gaming is unlikely to be touched, but the vast consumer market is going to be eaten by custom made ARM-based chips.
So in a world where processor design becomes a commodity, what does that mean for Intel and AMD? And what does that mean for the overall datacenter, consumer markets?
Processor design is already a commodity, and has been for many years. Any company with the cash can buy a license to the ARM64 instruction set and the reference core design, and have someone like TSMC or Samsung manufacture it.
These designs haven't taken over desktop market from x86 yet because those designs simply weren't performance-competitive with what AMD and Intel are pumping out, and it's not clear that that'll change anytime in the foreseeable future.
Apple knocked it out of the park with the M1, but they've been kicking ass for years, including their competition in the ARM processor space. Just because Apple's processors happened to use an ARM instruction set, doesn't imply that an ARM revolution is upon us.
Gargantuan battery life isn't a selling point? For laptop? In what universe?
You can find a place to plug in at basically any coffee shop or library you go to. My mom isn't spending 10 hours in a datacenter, so it doesn't really matter to her if the battery life is 3 hours or 12. For the average consumer, battery life has just been another stat on the spec sheet for years now.
I'm not sure I agree with this. I think if you asked someone whether battery life was a priority, they might say no. And if you asked them to rank tech specs I'm not sure it would necessarily be that high either. But the experience of using a laptop with a noticeably better battery is, for me, quite likely to be one of those things that you didn't know you were missing, even if you just charge it every now and then.
So, does your mom also not care about the weight of her laptop?
Seriously, I predict we will see Apple successfully attack the sub $1000 laptop market within two years. They sell the iPhone SE with an A13 for $399 so they could easily do so now they no longer have the 'Intel tax'. And the products will be a lot better than the Windows equivalents.
Most home users might use Office and that's about it. The allure of the Apple ecosystem will be strong especially for iPhone users.
Laptop batteries are also expensive in terms of money, weight, and bulk which puts Intel into a much larger bind.
Then Apple's success with the M1 will spur others - I would not be surprised if Microsoft follow them down the same route.
Market share is not what Apple is about. Apple is about profitability and control. Their move to their own silicon is driven by improvements in the reliability of their build pipeline (no more waiting for tick-tocks and whatnot) and tighter control/integration of their whole stack (same arch on phones and PCs). That these chips happen to perform so well that they are potential market-growers is a welcome coincidence.
It's certainly partly defensive - they were frustrated with Intel - but Apple would only make a move of this scale if it thought it created business opportunities for them.
(SAP is the largest non-American software company by revenue and does business management, workflow automation, and bookkeeping)
My prediction is that outside of hipster startups, M1 will have no effect on business laptop sales.
This is what everyone always says but the iPhone kicked off a whole BYOD trend that has ended up with many high-value employees caring a lot about what tools they have to use, and a lot of software engineers want Macs.
A lot of companies bigger than "hipster startups" use Macs; this tends to start with the C-suite, and people follow suit.
My point also wasn’t that everyone was going to switch to Mac. It was that M1 proves you can build a “better in every way” PC with an ARM architecture. Linux ARM is also being pushed by AWS heavily from the server side with impressive price/performance numbers.
Windows ARM has been failing for many years, but I suspect this is going to change. Microsoft has a talented virtualization group, where the HyperV roots go back to the Connectix Virtual PC team that built PPC/x86 emulation for the Mac. I suspect they can pull off something like Rosetta - they just need a chipmaker to collaborate with. Might even be Intel! Pat Gelsinger is an outside of the box thinker.
You remind me of folks that thought the iPhone / iPad would have no impact on Blackberry sales, as real businesses need keyboards.
Many companies long ago set up some beefy Citrix servers for those applications.
Really? That's surprising to me. I'd imagine that for the demographic of her and her friends, quality of life increases for their phones are far more material than for their computers.
> phones are far more material
Thus they don't really care about laptop battery life.
I think that is why Intel should be (and probably is) worried about Apple. They will make Intel redundant by having solved a harder problem which their own problem becomes a subset of.
In the consumer segment, you have regular people trying to make vacation videos with software like Adobe Premiere and Adobe Media Encoder, or Magix. NVENC quality is bad. AMD is horribly slow. The only fast, high-quality encode is with Intel's dedicated CPU instructions, which both apps heavily promote to their users.
And the 30% that you mention that run VMs... Wouldn't they be pretty happy if Intel added dedicated CPU instructions to make VMware better?
I agree that for the work that I do, AMD is as good as or better. But people doing highly parallelizable tasks like compiling are the minority.
I just don't think the market for home devices is thinking about their video encoding time when they buy a laptop, but I do think they'll use an M1 Mac and find it surprisingly fast, or hear from a friend or family member that they are really good.
Intel just haven't been optimising for the main user experience seen by these people, or those writing "normal" server software either. They've been pushing AVX512 instead, which looks good for video or things like that, but not for regular use-cases.
That only happens in California.
"4K Ultra HD video editing with Intel and MAGIX"
"Enjoy HD Video editing with Magix Movie Editor Pro and Intel Iris Graphics"
"Edit in 4K Ultra HD" + Intel Logo
"Finish and Share videos quickly with Intel Iris Graphics"
Plus, as a user of the software, I can tell you that if you tick the "Hardware Acceleration" checkbox on AMD, a popup will tell you to buy a supported Intel CPU and then turn the checkbox off again.
BTW I'm picking Magix here because in the local electronics store, that's the video software that you can buy as a box and that is featured in bundles with Intel laptops. So if someone clueless walks in there and says they need video editing, this is most likely what they will end up with.
Not sure where you're getting that these days? Absolutely in the days of Bulldozer, but AMD's Zen 3 architecture has taken even the single core lead from Intel, not to mention the multi core lead they've held for several years now.
AMD's GPU encoder still lags a long way behind Nvidia's, for example.
You might need a source for that.
AMD's latest consumer-level chips significantly outperform Intel's on both price and performance. For prosumer video editing, the Ryzen 9 5900X, the second most expensive "new" chip from AMD, delivers a 3.4% performance improvement over Intel's most expensive "new" chip, the 10980XE. Additionally, the 5900X retails for $549 USD while the 10980XE retails for about $1,000 USD.
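To put those figures together, here is a back-of-the-envelope calculation using only the numbers quoted above (the 3.4% figure and the prices come from this comment, not from a benchmark I ran myself):

```python
# Back-of-the-envelope perf-per-dollar from the numbers quoted above.
ryzen_price = 549.0    # Ryzen 9 5900X retail (USD)
intel_price = 1000.0   # Core i9-10980XE retail (USD, approximate)
relative_perf = 1.034  # 5900X ~3.4% faster in the editing workload cited

# Performance-per-dollar advantage of the 5900X over the 10980XE
ratio = relative_perf * (intel_price / ryzen_price)
print(f"5900X offers roughly {ratio:.2f}x the performance per dollar")
```

On these numbers the 5900X delivers close to twice the editing performance per dollar, which is the real gap the headline 3.4% hides.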
They (and AMD) did years ago. Intel VT.
They have this today. What would make them happier is cutting power utilization by half or more, which is looking quite possible with non-Intel Silicon.
Intel's secret sauce is inertia.
The assumption that Intel is not challengeable, and that the world doesn't need a company to dethrone it either.
But that assumption is no longer true, and the counter-movement is in full swing.
The future of computing is not on the CPU, if you ask me. It would move from general-purpose computing to heterogeneous computing, and possibly application-specific chips/FPGAs. MKL is fast, probably, but a GPU or an ASIC would be even faster.
That's not how latency works, and there is nothing too special about Intel's TBB library. It is a big, bloated group of libraries that doesn't actually contain anything irreplaceable. Don't be fooled by marketing or by people who haven't looked under the hood. It should also work on AMD CPUs.
> Raytracing? Intel embree.
Embree is a cool convenience, but it also doesn't marry anyone to Intel CPUs.
1) the "outside" guy (sales, know the customer)
2) the "inside" guy (operations, now the employees)
3) the "tech" guy
Any of these three can run the company, but whichever one it is, they need to have the other two near at hand, and they need to listen closely to them. The problem comes when, as at Intel and perhaps also at Boeing, you have options (1) or (2) in charge, and they're not listening to the person who is position (3) in the triumvirate, or they don't have a triumvirate at all. If the person in position (3) is in charge (as at AMD currently), they will still need to have experts in (1) and (2), and they will need to listen to them.
Gelsinger earned a master's degree from Stanford University in 1985, his bachelor's degree from Santa Clara University in 1983 (magna cum laude), and an associate degree from Lincoln Technical Institute in 1979, all in electrical engineering.
I'd call it an engineering background.
The only gig where you get rewarded handsomely even if you fail.
From various articles over the years it seems that what's happened to Intel internally is fairly typical: internal fiefdoms, empire-building, turf wars and the like. This is something you have to actively prevent from happening.
This is going to take someone with deep experience in fab engineering to figure out, not a bean counter. And it should probably involve a massive house cleaning of middle management.
And no, the answer isn't just another reorg. Unless you actively prevent it, reorgs become a semi-constant thing. Every 3-6 months you'll be told how some VP in your management chain you've never heard of, let alone met, now reports to some other VP you've never heard of or met. There'll be announcements about how the new structure is streamlined and better fits some new reality. And 6 months later you'll go through the same thing.
This is a way of essentially dodging responsibility. Nothing is in place long enough for anyone to be accountable for anything working or not working.
Hopefully the next CEO is as committed to all the other stuff, like treating the employees well and the community work. The best bit of the company isn't the tech at all (in my opinion).
That makes me feel old given that I first spoke with them as an analyst before ESX Server came out :-)
I'm not comparing Gelsinger to Steve Jobs in a general sense, but Jobs wasn't new to Apple when he returned -- and yet Jobs' return to leadership was transformative for the company.
When Steve Jobs returned to Apple, he literally fired most, if not all, of management consulting types. He changed the culture overnight. I don’t see that happening at Intel (but I hope I am wrong).
(that said i'm sure he's crying all the way to the bank with his millions so i'm not feeling too sorry for him)
I'd say there's a great opportunity for Nvidia to step up, cut out the middlemen (Intel and AMD), and put together some great ARM-based hardware running Windows on consumer-oriented hardware and Linux in data centers. Given that they're in the process of acquiring ARM, that probably is something they are actively planning to do. Apple just showed us all that the M1 can run circles around Intel and AMD. That's a trick others can pull off too. Certainly Nvidia.
I'd say both Intel and AMD need to make a move away from X86 soon or risk being marginalized. In AMD's case, RISC-V might be a nice lateral move to make. That paired with their GPU as an SOC might be attractive for a wide range of devices. The alternative would be having to license chip designs from their main competitor and making them richer in the process. Sticking with X86 is long term a losing game. People care less than ever about binary backwards compatibility.
Granted, ARM is a huge threat to Intel over the longer term, but AMD is taking market share now.
Intel/AMD split in x86 currently 61.4% / 38.6%.
If we (very incorrectly) assume 100% of Intel's sales were through Apple and were all going to be replaced by Apple, then, even given current trends, AMD and Apple would be at roughly 50/50 before long. Looks like OS X is around 16-17% of desktop operating system share, though. So the ~83% of buyers still buying x86 are going to continue choosing between Intel and AMD.
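The arithmetic behind that last point, spelled out (the macOS share is the rough figure quoted above, not a precise measurement):

```python
# Rough desktop market-share arithmetic from the figures above.
intel_x86 = 0.614    # Intel's share of the current x86 split
amd_x86 = 0.386      # AMD's share of the current x86 split
macos_share = 0.17   # macOS at roughly 16-17% of desktop OS share

# Even if every Mac buyer left x86 entirely, the rest of the market
# would still be choosing between Intel and AMD at today's split.
remaining = 1.0 - macos_share
print(f"Buyers still choosing x86: {remaining:.0%}")
print(f"Share of the whole desktop market: "
      f"Intel {remaining * intel_x86:.1%}, AMD {remaining * amd_x86:.1%}")
```

So the M1 alone carves at most the Mac slice out of Intel's addressable market; the rest of the fight is still Intel versus AMD.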
Apple M1 has some people believing that suddenly everyone will stop buying x86 chips and buy Apple unless someone releases a competitive ARM based chip for Windows/Linux. I'd like to see some evidence for that premise, though.
Apple's messaging has always been "better designed hardware + better software experience", and yet they still haven't breached 20% market share. A CPU that increases battery life (and yes performs very well) but still can't be bought with your Windows PC isn't going to rapidly change the market share. It could erode it over time, but this is certainly just conjecture, not proof. Let's revisit the conversation in 5 years and see what Apple, Intel and AMD have done, technologically, and what consumers have decided.
If one assumes that x86 remains the dominant architecture in the industry then, yes, it's basically a zero sum market share game between Intel and AMD. But lots of people don't think that represents reality in the second half of this decade.
ARM is growing in servers, but AMD is as well. It's not clear though how either smartphone or server architecture will affect desktop/laptop purchasing for consumers en masse.
Of course, if we're talking about 2025-2030, I'm sure any predictions I make are a roll of the dice, at best. But right now I don't think there's enough momentum from any player to have absolute certainty about 2025 and beyond. There is a lot of inertia with x86 in desktop/laptop, and so far Apple's Macbook Air/Pro and Mac Mini are the only high-performing options on ARM.
I like AMD but I'll be happy to see any technological progress that makes significant improvements to our quality of life.
Just five years ago I didn't consider laptops viable for gaming, and now I do most of my gaming on one. The Macbooks with M1 seem like they are capable of some level of gaming, but not "max setting" 1080p gaming, so it's not yet an option for someone like me to switch. But when I'm working, I do most of that on a powerful but very quiet desktop. The M1 chip would not improve my quality of life on my desktop because the efficiency of the chip won't really change anything for me. There's no compelling reason to swap out of my custom built machine to a Mac Mini.
Anecdotes are very personal. And so are computers. For many consumers for whom an M1-based machine would work, the benefits are all but lost on them anyway, for a variety of reasons. They don't make decisions based on CPU efficiency - just on what they are used to and what features they need and want. If a feature is really life-changing for that particular person, they might be OK with change, e.g. switching operating systems.
For a developer, it's either easy to think about switching because you know everything is cross-platform, or it's perhaps impossible to switch because you use exclusive software.
All kinds of things over the years. For example, Qualcomm's Snapdragon line.
And Intel was definitely pushing to get into mobile at one point. They made a big deal about processor compatibility from mobile up through the server. I still remember at one IDF, they made a big deal about how you wanted to run Intel for mobile (this was pre-iPhone) so that Flash would run the same everywhere.
I agree that it's hard to make predictions more than a few years out and certainly x86 has a lot of inertia. On the other hand, there's a lot more abstraction than there used to be and we know there's going to be a lot heterogeneity anyway (GPU, DPU, TPU, FPGA, SIMD instructions, etc.) given the slowing down of CMOS process scaling. So I don't think it's too big a stretch to imagine that we'll see a more varied processor landscape. (I expect ARM to gain share although I don't expect it to dominate on servers--though I have colleagues who do expect that to happen.)
Sounds like old guard thinking to me
It feels like they're throwing in the towel on being the leader, giving up on trying to catch up process wise and will look to maximize their existing revenue. RIP
It says "Dan Loeb's Third Point hedge fund in December urged Intel's board to explore "strategic alternatives."
That is typical hedge fund pressure attempting to squeeze (short term) money out of their investment. The article has two more paragraphs consisting of the hedge fund's cheap shot quotes.
It isn't clear Intel's board succumbed to the hedge fund pressure. Changing the CEO from a finance-oriented CEO to a technically-oriented CEO (Gelsinger) is taking Intel back to its roots rather than "exploring strategic alternatives." Intel was founded and led for many years by technically-oriented CEOs.
"Strategic alternatives" often means "split up the businesses." Silver Lake proposed something similar to AMD in 2015. In the end, the Silver Lake deal didn't happen  and AMD stock is up 45x since then.
Intel's fabrication business needs more volume so it can increase R&D spending and capital expenditure; it can get this volume by operating as a merchant fab.
Intel's design business doesn't need to be held back by Intel's fabrication delays. I imagine designers at Intel would prefer to compete with AMD on the same playing field--TSMC's latest node.
My understanding is that Silver Lake wanted AMD to split apart its product segments. They were going to buy 20-25% of the company, but the deal never materialized.
Intel's vertical integration is an asset IMO, especially in a supply-constrained environment like the present.
Boeing made an entire line of defective airplanes that could autonomously kill everyone aboard under normal usage. Then, a respiratory virus hammered the travel industry.
In contrast, the semiconductor industry is seeing more demand than ever before, and presently undergoing a shortage. Intel has mismanaged 10nm and 7nm, but the company maintains a majority CPU market share overall and an even wider margin for servers.
On the balance of probabilities I think the Max debacle ended up doing more harm, but IMO Intel (and AMD/ARM to a lesser extent) did get off easy because of the extremely technical nature of the issue.
Edit: I also agree that the max issue has a much more direct Executive Directive -> Harm line to draw. I don't think the Intel CEO went down to engineering and said anything comparable to "create a new version of a plane that needs no new training while having completely new larger more efficient engines, even if that's physically impossible"
If they were going from a former CTO to a former CFO, I would agree. They are going to a former CTO from a former CFO. How does this make it seem like they are looking to maximize their existing revenue rather than trying to get someone in to "fix" their issues?
My goober hot take on Boeing is a bit different.
Another victim of financialization.
When McDonnell Douglas reverse-acquired Boeing, the balance changed. Emphasis on share price, gutting wages while doing stock buybacks, shady business practices (bribes for defense contracts, gutting oversight), and so forth.
Of course, there's always more to the story. Like I have no idea how much to blame Clinton Admin's push for consolidation and monopolies (removing competition). Or how to explain the quixotic quest to outsource and offshore core competencies.
So as casual observer, it seems like Intel similarly lost its way.
EDIT: I was a bit harsh, toned it down.
EDIT 2: This is probably petty, but I can't ignore the fact that there was a significant hubbub at Intel regarding him using the "Dr" prefix. He scrubbed it from his bio and internal pages when it was pointed out that it didn't come from an accredited university and that it was honorary. He also caught flak internally for having the pope bless a wafer for Intel's future success. It was very weird, especially given the high percentage of Muslim engineers at Intel, and its focus on neutrality.
The stock jumped quite high, though. Not sure if it's because a really bad CEO was replaced with a slightly less bad one.
I'm not super sensitive to exactly who replaces Bob, though Pat seems like a decent choice having read about him now.
I will say I was ready to unload the shares if Bob was replaced with another MBA, though. Having a non-engineer lead an organization like Intel was a disastrous choice and seriously makes me question the board's judgement.
Almost all universities explicitly and strongly tell the people they give honorary degrees to not to do this.
Same goes for Benjamin Franklin btw, whom I see you left out of your list.
Also, what's up with Peru? Marxism?
Friendly advice from someone who's learned this the hard way: If you edit your comment so that its tone is a bit more friendly, it's more likely to foster a real discussion.
Not sure what to make of the fact that he wrote two books on balancing work and family life, and makes me wonder if this is the general that will win the silicon war. Time will tell, I suppose.
Found this out the hard way when I tried to edit some articles about Max Stirner. They don't even like small changes by randoms...