#quantum

4 posts · Last used 16d

@StefanoGaivota@universeodon.com · Mar 01, 2026
Since cats are suspended in a state of quantum uncertainty, if you make an additional observation of a cat, i.e. a completely different set of reflected photons, is it a different cat? #Cats #UncertaintyPrinciple #Schroedinger #Quantum
Boosted by Charlie Stross @cstross@wandering.shop
@flub@mastodon.social · Dec 31, 2025
From somewhere at #39c3 #quantum #cryptography #security
Thread context · 4 posts in path
Root · @cstross@wandering.shop
@skjeggtroll@mastodon.online I'm using the term not only in the context of its 1980s meaning but also to refer to the investment bubble (investment in AI was notoriously slack during the first AI winter).
Ancestor 2 · @skjeggtroll@mastodon.online
@cstross@wandering.shop It's difficult to divine, and in particular about the future, but I suspect the AI bubble popping might spell the end of cheap money for the tech industry in general and Silico…
Parent · @cstross@wandering.shop
@skjeggtroll@mastodon.online Of course it will. It's the consequence of the taper-off of Moore's Law rippling through the consequential supply chain it propped up for 50 years, i.e. semiconductor produ…
Current reply
@meltedcheese@c.im · Nov 17, 2025
@cstross@wandering.shop @skjeggtroll@mastodon.online Moore’s Law is dead for now. I did a study a few years ago to look at what is happening and will happen in microprocessors. Short story is that traditional processor architecture is hitting end of life. Feature sizes are so small now that quantum effects are a significant factor. High speed and small size also mean we are up against a thermal barrier as well. Clever approaches with System-On-a-Chip (#SOA), 3D stacking, maybe Processor-In-Memory (#PIM) and distributed multiprocessing will squeeze out more progress for maybe a decade. After that comes the next computing revolution — a shift to non-Von Neumann #computing. #Quantum has the spotlight because that’s the really big win, but there are other approaches that are likely to be commercially viable before quantum is mature. I’m optimistic about the tech, less so about the rate of adoption and change that will be required, especially if the most talented early-career computer scientists and engineers keep chasing the associative/statistical methods that include LLMs.
