• 0 Posts
  • 18 Comments
Joined 1 year ago
Cake day: June 13th, 2023




  • BedbugCutlefish@lemmy.world to memes@lemmy.world · Int check · 3 months ago

    I think a controller is only ‘necessary’ for Souls games because they don’t support keyboard and mouse well. I’d prefer to use a keyboard, but all of the inputs and menuing are fucked up.

    Tbh, it’s a testament to how good the games are that they’re enjoyable despite a huge lack of QoL across the board.





  • That is what I think the owner is doing here: scamming venture capital firms for a tech that cannot work.

    And I mean, it’s not like I have any proof. I can’t read minds; maybe he is a true believer.

    But this company feels like those companies back in the ’80s that sold tickets to Mars for the rockets they were ‘just about to build’: a scam.

    This isn’t a research firm. It isn’t trying to find the exact settings and layouts to make fusion possible. If the article can be taken at face value, this is a company set up to build a commercial fusion plant. And I find that, in 2023, patently absurd.


  • I hope it works.

    But I’m skeptical enough to say that I think this is a scam. We’re closing in, research-wise, on getting fusion to generate more power than it takes to run, which is awesome!

    But it’s still a far trek from that milestone to producing enough power to be practical (I’ve heard it said you really need to aim for 10x more production than input, minimum, for it to make any sense).
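
    (For what it’s worth, those figures are usually talked about as the fusion ‘gain factor’. A rough sketch of the thresholds I mean, treating the 10x number as a rule of thumb I’ve heard rather than anything authoritative:)

    ```latex
    % Fusion energy gain factor: fusion power produced vs. heating power put in.
    % Q = 1 is scientific break-even; Q ~ 10 is the oft-quoted rule of thumb for
    % a practical plant (an assumption here, not a hard requirement).
    \[
      Q = \frac{P_{\text{fusion}}}{P_{\text{input}}},
      \qquad Q = 1 \ \text{(break-even)},
      \qquad Q \gtrsim 10 \ \text{(rule of thumb for a practical plant)}
    \]
    ```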

    And that is still a trek from making a fusion plant competitive with existing grid power.

    I’m skeptical that the plant they’re building will even generate power, which is like three steps away from making any commercial sense.




  • You are, of course, correct.

    But even so, costs are costs. It doesn’t matter if you’ve achieved communism and are in a moneyless, stateless existence; you still need labor and materials to build nuclear, and labor and materials to maintain it (along with other infrastructure).

    And I’m not anti-nuclear; it does make sense to use sometimes, in some amounts. It’s just very, very costly for what it provides.

    But frankly, even accounting only for current tech, widespread nuclear just doesn’t make that much sense compared to renewables + storage and large grid interconnects.





  • > Consuming content illegally is by definition a crime, yes. It also has no effect on your output. A summary or review of that content will not be infringing; it will still be fair use.

    > That their use is infringing and a crime is your opinion.

    “My opinion”? Have you read the headline? It’s not my opinion that matters, it’s that of the plaintiffs in this lawsuit. And this lawsuit indeed alleges that copyright infringement has occurred; it’ll be up to the courts to see if the claim holds water.

    I’m genuinely not sure whether GPT-4 or other AI models are infringing copyright or otherwise breaking the law. But I think there’s enough that seems questionable that a lawsuit is a valid way to do some fact-finding, and honestly, I feel like the law is a few years behind on AI anyway.

    But it seems plausible that the AI could be found to be ‘illegally distributing works’, or to have otherwise broken IP law at some point during training or operation. A lot depends on what kind of agreements were signed over the contents of the training datasets, something I frankly know nothing about and would like to see come to light.


  • I mean, you can do that, but that’s a crime.

    Which is exactly what Sarah Silverman is claiming ChatGPT is doing.

    And beyond the individual crime of a person reading a pirated book, again, we’re talking about ChatGPT and other AI magnifying reach and speed beyond what an individual person could ever manage even if they did nothing but read pirated material all day, not unlike websites like The Pirate Bay. Y’know, how those websites constantly get taken down and have to move around the globe to places beyond the reach of the law, because of the crimes they’re committing.

    I’m not, like, anti-piracy or anything. But I also don’t think companies should be using pirated software, and my big concern about LLMs isn’t really private use, but corporate use.


  • The issue isn’t that people are using others’ works for ‘derivative’ content.

    The issue is that, for a person to ‘derive’ comedy from Sarah Silverman the ‘analogue’ way, you have to get her works legally, be that by streaming her comedy specials or watching the movies/shows she’s written for.

    With ChatGPT and other AI, it’s been ‘trained’ on her work (and presumably on as many others’ works as possible) once, and now there are no ‘views’ of, or even sources given for, those properties.

    And like a lot of digital work, its reach and speed are unprecedented. Previously, yeah, of course you could still ‘derive’ from people’s works indirectly, like from a friend who watched something and recounted the ‘good bits’, or through general ‘cultural osmosis’. But that was still limited by the speed of humans, and of culture. With AI, it can happen a functionally infinite number of times, nearly instantly.

    Is all that to say Silverman is 100% right here? Probably not. But I do think the legality of ChatGPT, and of other AI that can ‘copy’ artists’ work, is worth questioning. It’s a sticky enough issue that I’m genuinely not sure what the best route is. Certainly, I think current AI writing and image generation ought to be ineligible for commercial use until the issue has at least been addressed.