Laboratory planner by day, toddler parent by night, enthusiastic everything-hobbyist in the thirty minutes a day I get to myself.

  • 0 Posts
  • 6 Comments
Joined 1 year ago
Cake day: July 31st, 2023


  • It’s not a coincidence that Texas is a hotbed of development for “microgrid” systems to cover for when ERCOT shits the bed – and of course all those systems are made up of diesel and natural gas generator farms, because Texans don’t want any of that communist solar power!

    I’ve got family in Texas who love it there for some reason, but there’s almost no amount of money you could pay me to move there. It’s bad enough when I have to work on projects in the state; contrary to the popular narrative, in my opinion it’s a worse place than California to try to build something, and that’s entirely down to the personalities that seem to gravitate to positions of power there. I’d much rather slog through the bureaucracy in Cali than tiptoe around a tinpot dictator in the planning department.


  • Thrashy@lemmy.world to Technology@lemmy.world · The decline of Intel.. · 6 months ago

    The only link I’m aware of is that Intel operates an R&D center in Haifa (which, as it happens, produced the Pentium M architecture that became the Core series of CPUs and saved Intel’s bacon after they bet the farm on NetBurst and lost to the Athlon 64). Linkerbaan’s apparent reinvention of the Protocols of the Elders of Zion notwithstanding, that office exists simply to tap into the pool of talented Israeli electronics and semiconductor engineers.


  • Thrashy@lemmy.world to Technology@lemmy.world · The decline of Intel.. · 6 months ago

    Historically AMD has only been able to take the performance crown from Intel when Intel has made serious blunders. In the early 2000s, it was Intel committing to NetBurst in the belief that, if pipelined deeply enough, processors could scale past 5 GHz on their fab processes. Instead they got caught out by unexpected quantum effects leading to excessive heat and power leakage, at the same time that AMD produced a very good follow-on to their Athlon XP line of CPUs in the form of the Athlon 64.

    At the time, Intel did resort to dirty tricks to lock AMD out of the prebuilt and server space, for which they ultimately faced antitrust action. But the net effect was that AMD wasn’t able to capitalize on their technological edge and ended up having to sell off their fabs for cash, while Intel bought enough time to revise their mobile CPU design into the Core series of desktop processors and reclaim the technological advantage. Simultaneously, AMD was betting the farm on Bulldozer, believing that the time had come to prioritize multithreading over single-core performance (it wasn’t time yet).

    This is where we enter the doldrums, with AMD repeatedly trying and failing to make the Bulldozer architecture work, while Intel coasted along on marginal updates to the Core 2 architecture for almost a decade. Intel was gonna have to blunder again to change the status quo – which they did, by betting against EUV for their 10nm fab process. Intel’s process leadership stalled and performance hit a wall, while AMD was finally producing a competent architecture in the form of Zen, and then moved ahead of Intel on process when they started manufacturing Zen2 at TSMC.

    Right now we’re in a very interesting time for CPU development: Intel is finally getting up to speed with EUV and working on architectural improvements to catch up with AMD (and both need to bridge the gap to Apple Silicon now), while AMD goes from strength to strength with its Zen revisions. I fear a bit for AMD, as I think the fundamentals are stronger for Intel (a stronger data center AI value proposition, a graphics group seemingly on the upswing now that they’re finally taking it seriously, and still in control of their own destiny in terms of fab processes and manufacturing), while AMD is struggling with GPU and AI development and depends for process leadership on TSMC, which is perpetually under threat from mainland China. But there’s a lot of strong competition in the space, which hasn’t been the case since the days of the Northwood P4 and Athlon XP, and that’s exciting.


  • On the one hand, I agree with you that the expected lifespan of current OLED tech doesn’t align with my expectation of monitor life… But on the other hand, I tend to use my monitors until the backlight gives out or some layer or other in the panel stackup shits the bed, and I haven’t yet had an LCD make it past the decade mark.

    In my opinion OLED is just fine for phone displays and TVs, which aren’t expected to be lit 24/7 and don’t have lots of fixed UI elements. Between my WFH job and hobby use, though, my PC screens are on about 10 hours a day on average, displaying one of a handful of programs with fixed, high-contrast user interfaces. That’s gonna put an OLED panel through the wringer in quite a bit less time than I’ve come to expect from my LCDs, and that’s not acceptable to me.


  • Through the course of my career I’ve somehow lost office space as I’ve ascended the corporate food chain. I had a private office/technician room in my first job out, then had an eight foot cubicle with high walls, then a six foot cubicle with low dividers, and then the pandemic hit. The operations guy at the last place was making noises about a benching arrangement after RTO, like people were going to put up with being elbow to elbow with Chris The Conference Call Yeller and Brenda The Lip Smacking Snacker while Team Loudly Debates Marvel Movie Trivia is yammering away the next row over.

    Hell, if it meant getting a space to myself with enough privacy to hear my own thoughts, I might consider giving up my current WFH gig. But everybody’s obsessed with building awful office hellscapes, and I don’t have the constitution to put up with that kind of environment.