![](https://feddit.nl/pictrs/image/fbe0f62f-ff90-4ed8-86b8-d74abad301e4.png)
![](https://lemmy.world/pictrs/image/a8207a32-daa2-4b31-aab4-2d684fc94d18.png)
Ah, the sweet sounds of a simpler, worry-free time …
> most PCs by that time had built-in MIDI synthesizers
Built-in? You had AdLib cards for FM synthesis, but they were never built-in, and most PCs didn’t even have them. AdLib cards used the Yamaha OPL2 or OPL3 chip.
> Along came Creative Labs with their AWE32, a synthesizer card that used wavetable synthesis instead of FM
You are skipping a very important part here: cards that could output digital audio. The early Soundblaster cards were the pioneers here (SB 1.0, SB 2.0, SB Pro, SB16). The SB16, for example, was waaaaay more popular than the AWE32 ever was, even if it still used OPL3-based FM synth for music. It’s the reason why most soundcards in the 90s were “Soundblaster compatible”.
Digital audio meant that you could have recorded digital sound effects in games. So when you fired the shotgun in Doom to kill demons, it would play actual sound effects of shotgun blasts and demon grunts instead of bleeps or something synthesized, and it was awesome. This was the game changer that made soundcards popular, not wavetable.
The wavetable cards I feel were more of a sideshow. They were interesting, and a nice upgrade, especially if you composed music. They never really took off though and they soon became obsolete as games switched from MIDI based audio to digital audio, for example Quake 1 already had its music on audio tracks on CD-ROM, making wavetable synthesis irrelevant.
BTW, I also feel like you are selling FM synthesis short. The OPL chips kinda sucked for plain MIDI, especially with the Windows drivers, and they were never good at reproducing instrument sounds, but if you knew how to program them and treated the chip as its own instrument rather than as a tool to emulate real-world instruments, they were capable of producing beautiful electronic music with a very distinctive sound signature. You should check out some of the AdLib trackers, like AdTrack2, for some examples. Many games also had beautiful FM-synthesized soundtracks, and I often preferred them over the AWE32 wavetable versions (e.g. Doom, Descent, Dune).
There are basically four positions you can take about this:
I am on (2), as are most historians, and you put yourself on (1).
> if it’s good enough for the majority of historians
It isn’t. Historians would love to have independent evidence of the existence and crucifixion of Jesus, but there isn’t any… so most historians refrain from taking a position one way or the other. The ones that do have to make do with what little objective information they have, and the best they can come up with is: well, because of this embarrassing thing, it’s more likely that he did exist and was crucified than that he didn’t, because why would they make that up?
That’s rather weak evidence, and far from “proof”.
> Not sure why you’d need more
Well, for one, because the more prominent people who have studied this have a vested interest in wanting it to be true. For example, John P. Meier, who posited the criterion of embarrassment that I outlined in my previous comment, isn’t really a historian but a Catholic priest, a professor of theology (not history) and a writer of books on the subject.
> There was a guy named Jesus that was crucified by the romans and all that. There is proof of that
There isn’t, actually. The proof is basically: it’s embarrassing that their cult leader got painfully crucified, so the early Christians and writers of the New Testament wouldn’t have made that shit up.
Personally I find it rather unconvincing.
Pro tip: set a 40 minute timer on your phone as soon as you put the beer in the freezer.
> How can air get heat saturated? i followed you thus far but its not like humidity, you can always add more heat
When the temperature of the air and the temperature of the object you want to cool reach equilibrium, no heat gets transferred anymore.
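The approach to that equilibrium is roughly exponential (Newton’s law of cooling), which is why a fixed timer like the 40-minute one above works so well. A minimal sketch; the rate constant `k` here is a made-up ballpark, not a measured value:

```python
import math

def cooling_time(t0, t_env, t_target, k):
    """Newton's law of cooling, T(t) = T_env + (T0 - T_env) * exp(-k*t),
    solved for the time at which T(t) reaches t_target."""
    return math.log((t0 - t_env) / (t_target - t_env)) / k

# Room-temperature beer (22 °C) into a -18 °C freezer, target 4 °C.
# k = 0.015 per minute is an assumed ballpark for a can; real values
# depend on the can and the freezer.
minutes = cooling_time(22.0, -18.0, 4.0, 0.015)  # ≈ 40 minutes
```

With these assumed numbers the answer lands right around the 40-minute timer; the point is only that the cooling curve flattens out as it approaches the freezer temperature.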
> who is going to use a VPN (an internet privacy tool) from Google?
Exactly. That would be like using a web browser made by Google so they have direct access to your internet browsing history. Ridiculous!
Slashdot still exists, but it was mostly popular in the late 90s to mid 2000s.
I mean, he was still reading Slashdot, so I guess “yes”
> We are talking about addresses, not counters. An inherently hierarchical one at that. If you don’t use the bits you are actually wasting them.
Bullshit.
I have a 64-bit computer, it can address up to 18.4 exabytes, but my computer only has 32GB, so I will never use the vast majority of that address space. Am I “wasting” it?
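To put numbers on that, a quick sanity check of the arithmetic (the 18.4 EB figure, and the fraction of the address space actually backed by RAM):

```python
# A 64-bit address space covers 2**64 bytes; one exabyte is 10**18 bytes.
address_space_bytes = 2**64
exabytes = address_space_bytes / 1e18          # ≈ 18.45 EB

installed_bytes = 32 * 2**30                   # 32 GiB of RAM
fraction_used = installed_bytes / address_space_bytes  # ≈ 2e-9
```

So the machine uses about two billionths of its address space, and nobody calls that waste.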
> All the 128 bits are used in IPv6. ;)
Yes, they are all “used”, but you don’t need them. We are not using 2^128 IP addresses in the world. In your own terminology: you are using 4 registers for a 2-register problem. That is much more wasteful in terms of hardware than using 40 bits to represent an IP address and wasting 24 bits.
> you are wasting 24 bits of a 64-bit register
You’re not “wasting” them if you just don’t need the extra bits. Are you wasting a 32-bit integer if your program only ever counts up to 1,000,000?
Even so, when you do start to need them, you can gradually make the other bits available in the form of more octets. Like you can just define it as a.b.c.d.e = 0.a.b.c.d.e = 0.0.a.b.c.d.e = 0.0.0.a.b.c.d.e
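That zero-extension idea can be sketched like this (a hypothetical widened-IPv4 scheme for illustration, not any real protocol):

```python
def parse_addr(text, width=5):
    """Parse a dotted address of up to `width` octets, left-padding with
    zero octets so older, shorter addresses keep their meaning."""
    octets = [int(part) for part in text.split(".")]
    if len(octets) > width or any(not 0 <= o <= 255 for o in octets):
        raise ValueError(f"not a valid {width}-octet address: {text!r}")
    return [0] * (width - len(octets)) + octets

# A legacy 4-octet address and its zero-extended form parse identically:
assert parse_addr("192.168.0.1") == parse_addr("0.192.168.0.1") == [0, 192, 168, 0, 1]
```

Existing addresses stay valid unchanged, and each extra octet only gets spelled out once somebody actually needs it.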
> Recall that IPv6 came out just a year before the Nintendo 64
If you’re worried about wasting registers it makes even less sense to switch from a 32-bit addressing space to a 128-bit one in one go.
Anyway, your explanation is a perfect example of “second system effect” at work. You get all caught up in the mistakes of the first system, in this case the lack of addressing bits, and then you go all out to correct those mistakes in your second system, giving it all the bits humanity could ever need before the heat death of the universe, while ignoring the real-world implications of your choices. And now you are surprised that nobody wants to use your 128-bit abomination.
Hmm, I can’t say that I’ve ever noticed this. I have a 3950x 16-core CPU and I often do video re-encoding with ffmpeg on all cores, and occasionally compile software on all cores too. I don’t notice it in the GUI’s responsiveness at all.
Are you absolutely sure it’s not I/O related? A compile is usually doing a lot of random IO as well. What kind of drive are you running this on? Is it the same drive as your home directory is on?
Way back when I still had a much weaker 4-core CPU I had issues with window and mouse lagging when running certain heavy jobs as well, and it turned out that using `ionice` helped me a lot more than using `nice`.
I also remember that fairly recently there was a KDE/Plasma stutter bug due to it reading from `~/.cache` constantly. Brodie Robertson talked about it: https://www.youtube.com/watch?v=sCoioLCT5_o
IPv6 = second system effect. It’s way too complicated for what was needed, and this complexity hinders its adoption. We don’t need 100 IP addresses for every atom on the earth’s surface, and we never will.
They should have just added an octet to IPv4 and be done with it.
I run a Pi-hole as well, but it is a very rudimentary tool compared to browser-based adblockers like uBlock Origin. It can only block DNS queries, and can’t, for example, block ads if they are served from the same domain as the main site (e.g. YouTube), or block specific elements on a page, or block a specific script from running.
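The limitation can be shown with a toy sketch (the blocklist and hostnames are made up for illustration): a DNS sinkhole only gets to answer per hostname, so anything served from the content’s own hostname sails through.

```python
# Hypothetical blocklist, purely for illustration.
BLOCKLIST = {"ads.example.net", "tracker.example.org"}

def dns_blocked(hostname):
    """A DNS sinkhole answers per hostname: block it entirely or not at all."""
    return hostname in BLOCKLIST

# A dedicated third-party ad host can be sinkholed:
assert dns_blocked("ads.example.net")
# An ad served from the content's own hostname cannot be, because refusing
# to resolve it would take the whole site down with it:
assert not dns_blocked("www.youtube.com")
```

A browser extension, by contrast, sees individual requests and page elements after DNS resolution, which is why it can block so much more.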
> only this time they’ve got a decade of research behind them and maybe they get the bomb first
Maybe that’s why we’re living in the universe where this didn’t happen, because in the universe where it did, we wouldn’t exist (many worlds/anthropic principle interpretation)
> At 17:00 everyone’s got a beer on their desk and by 18:00 the doors are locked and the lights are out. One Thursday a month the table is used for beer pong after work and we play card games like Exploding Kittens.
I’d rather go home at 17:00 and do all those things with my real friends, or you know, spend some quality time with my partner.
I’d say the problem with Linux is not so much with beginner users (it’s easy enough to set up a basic desktop with a web browser and some tools), but with intermediate users who know enough to be dangerous on Windows and think that makes them “advanced”, and who then can’t apply their clickety-clackety ways of figuring things out on Linux.
Sound typically (*) didn’t require “drivers” or any TSR though. The game had to do all the hardware control itself.
It was usually enough to set a BLASTER variable to point it at the correct I/O port, IRQ and DMA channel, and perhaps run a program at boot to initialize the card and set volume levels, but no TSR eating up memory.
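For example, a typical AUTOEXEC.BAT line for an SB16 looked something like this (the exact values depend on how the individual card was configured):

```bat
REM SB16 at its common defaults: I/O port 220h, IRQ 5, 8-bit DMA 1,
REM 16-bit DMA 5, MPU-401 MIDI port 330h, card type 6 (SB16).
SET BLASTER=A220 I5 D1 H5 P330 T6
```

Games would read this variable at startup and talk to the hardware directly from there.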
(*) Some exceptions are later soundcards of the Win 9x era that did crappy emulation of a real Soundblaster via a TSR in DOS.