• 0 Posts
  • 55 Comments
Joined 3 months ago
Cake day: April 10th, 2025

  • I quite literally yelled at the introduction of ‘the cloud’ as yet another stupid corpo buzzword.

    I was working at MSFT the first time someone had ever asked me if I had a ‘cloud’ backup.

    What? Do you mean a remote server, offsite?

    No, no, in the cloud!

    5 minutes of research later.

    Oh, so yes, you do mean on a remote server somewhere.

    No, no, in the cloud!

    head_desk.jpg





  • I’ve done similar things in a coffee shop before, just working on my own code, and I have actually been ‘politely’ asked to leave by the staff.

    The staff evidently being a bunch of morons who thought I was… hacking into … something?

    They didn’t know what, but they were very concerned.

    I was unable to convince them I was not, because ‘terminal’ = ‘hacking’ to idiots who only know anything about computers via movies and TV shows.


  • Quite ironically, they are using syntax, specifically ‘/’, to indicate a specific kind of meaning afterward.

    /sarcasm

    /s

    /joking

    /j

    I’ve seen all of these used to more explicitly indicate that the previous statement was sarcastic, or a joke, due to irony being largely dead, but also to help people who may not natively read/speak/write English.



  • sp3ctr4l@lemmy.dbzer0.com to Programmer Humor@programming.dev · Absolutely Legend (edited, 2 days ago)

    Is this supposed to be a joke, or have we truly gotten to the point where… coding in a terminal via, like, Hyprland or w/e, without relying on what is basically an annoying tutorial character from a video game that acts as an assistant…

    This is psychopathy?

    Having actual competence in one’s field?

    Oh god we’re all doomed, they’ll soon be alternating between worshipping us demigods, or burning us at the stake.





  • sp3ctr4l@lemmy.dbzer0.com to Memes@lemmy.ml · What kind of man you want? (edited, 6 days ago)

    Love… is a burnin’ thing…

    And it makes… a fiery ring.

    Bound… by wild desire…

    I fell into a ring of fire.

    The taste… of love is sweet…

    When hearts… like ours meet.

    I fell for you like a child…

    Ooooh, but the fire went wild.

    https://www.biography.com/musicians/johnny-cash-june-carter-love-story-relationship

    Johnny Cash and June Carter:

    Two fucked up, rough and tumble assholes who… married and remained together, totally devoted to and thankful for each other for 35 years, and died within 4 months of each other.

    Burnin’ Ring of Fire is one of the most famous songs of all time… June wrote it, Johnny sang the most famous version.

    https://youtube.com/watch?v=1WaV2x8GXj0

    Andrew Tate:

    Self-described drug dealer, rapist, sex trafficker, failed MMA fighter… openly states he is disgusted by nearly all women and only fucks them because it makes other men envious of him; also claims to only fuck 18 and 19 year olds… apparently he married someone a few months ago.

    I’m sure that’ll work out well.

    Oh right, uh, no notable discography, nor chin.

    (why do you think he has the beard)




  • So, Bazzite does have a KDE 6 variant, and it works very, very well, especially on a handheld PC.

    It takes the approach of sandboxing off the core OS while giving you a bunch of tools for running flatpaks and other things, and it lets you set up DistroBox to semi-sort-of run multiple Linux OSes simultaneously, if you want to, say, compile something from source that only has its dependencies properly figured out in… not Arch, which is what SteamOS is based on…

    I run it on my SteamDeck because it offers more ability to use it as an actual PC, while still being rock solid in gaming mode.

    But uh… for more discussion… I’m going to kind of not answer your question and suggest something else:

    Check out PikaOS.

    https://wiki.pika-os.com/en/home

    Basically, much like Nobara is a ‘gaming-tuned’, optimized, cutting/bleeding-edge version of Fedora…

    PikaOS basically is that, but for Debian.

    If you’re used to using Pop!_OS, well, that’s ultimately Debian based, so there may be less of a learning curve now that you’re broadly familiar with the Debian environment.

    PikaOS works with GNOME, KDE, or Hyprland if you want an even lighter-weight DE.

    They are also working on a distro that is handheld-PC-capable out of the box, but it’s not ready yet.

    From what I’ve seen from various YouTubers… PikaOS is trading blows with Cachy and Nobara for getting the highest frame rate out of a game, in same-hardware / same-settings FPS comparisons… sometimes it is actually beating them.

    Uh also, yeah, look into CachyOS; it seems to be the latest hotness for an Arch-based, gaming-optimized, but widely functional for ‘whatever’ OS, if you’re curious about trying out Arch, and of course thus being able to constantly let everyone know you use Arch, actually.


  • sp3ctr4l@lemmy.dbzer0.com to Memes@lemmy.ml · It isn't fair (edited, 14 days ago)

    I mean, yes, hair is genetic… but genetics also include epigenetics.

    If you take two identical twins, both prone to alopecia, give one of them 45 years of nothing but unending stress, and the other a calm and relaxing life, I’m pretty confident your stressed out twin is gonna be more bald more quickly than the other.

    But, to go back to general agreement: yeah, basically every common hair-loss prevention whatever is almost entirely a scam.

    We’re talking like… twice-daily usage for months results in… 8 to 15 new hairs per square centimeter, for Rogaine/minoxidil.

    Sure, technically, it does work… but just barely; you’d have to keep using it forever, and even then, if you’re prone to hair loss, your follicle loss rate will exceed the regrowth rate at some point.
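    To put those numbers in scale, assuming a commonly cited baseline scalp density of roughly 200 hairs per square centimeter (an assumption for illustration, not a measured figure):

    ```python
    # Rough scale of the minoxidil regrowth figure above.
    baseline = 200                        # hairs per cm^2, assumed typical density
    regrowth_low, regrowth_high = 8, 15   # new hairs per cm^2 after months of use

    print(regrowth_low / baseline)    # 0.04  -> ~4% gain
    print(regrowth_high / baseline)   # 0.075 -> ~7.5% gain
    ```

    So even the optimistic end of the range is a single-digit-percentage improvement over an assumed baseline.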


  • That is a prominent example of forced RT… basically, as I described with the TAA example in my other reply…

    idTech 8 seems to be the first engine that just literally requires RT for its entire render pipeline to work.

    They could theoretically build another version of it off of a Vulkan base, to let you turn RT off… but that would likely be a massive amount of work.

    On the bright side… at least the idTech engines are actually well coded, and they put a lot of time into making the engine actually very good.

    I didn’t follow the marketing ecosystem for Doom Dark Ages, but it would have been really shitty if they did not include ‘you need a GPU with RT cores’.

    On the other end of the engine spectrum:

    Bethesda… yeah, they have entirely lost control of their engine; it is a mangled mess of nonsense, and the latest Oblivion remaster just uses UE to render things slapped on top of Gamebryo, because no one at Bethesda can actually code worth a damn.

    Compare that to oh I dunno, the Source engine.

    Go play TitanFall 2. A 10-year-old game now, built on a modified version of the Portal 2 Source engine.

    Still looks great, runs very efficiently, can scale down to older hardware.

    Ok, now go play HL Alyx. If you don’t have VR, there are mods that do a decent job of converting it into M+K.

    Looks great, runs efficiently.

    None of them use RT.

    Because you don’t need to, if you take the time to actually optimize both your engine and game design.


  • I meant they also just don’t bother to optimize texture sizes, didn’t mean to imply they are directly related to ray tracing issues.

    Also… more and more games are clearly being designed, and marketed, with ray tracing in mind.

    Sure, it’s not absolutely forced on in too many games… but TAA often is forced on, because no one can run ray tracing without temporal intelligent upscaling and frame gen…

    …and a lot of games just feed the pixel motion vectors from their older TAA implementations into the DLSS / FSR implementations, and don’t bother to recode the TAA to expose those motion vectors as an optional API that doesn’t actually do AA…

    … and they often don’t do that because they designed their entire render pipeline to only work with TAA on, and half the game’s post-processing effects would have to be recoded to work without TAA.

    So if you summarize all that: the ‘design for raytracing support’ standard is why many games do not let you turn off TAA.
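    The coupling described above can be sketched as a toy: in the first pipeline, motion vectors are only produced on the way through the TAA pass, so the upscaler forces TAA on; in the second, vector generation is its own step and TAA becomes optional. All names here are hypothetical, not any real engine’s API.

    ```python
    # Toy sketch of the TAA / motion-vector coupling. All names hypothetical.

    def compute_motion_vectors(frame):
        # In a real engine this comes from per-pixel velocity buffers.
        return [(0.1, 0.0)] * len(frame)

    def taa_pass(frame, history, motion_vectors):
        # Blend current frame with reprojected history (toy: simple average).
        return [(c + h) / 2 for c, h in zip(frame, history)]

    # Coupled design: the upscaler can only get vectors by running TAA,
    # so TAA cannot be turned off.
    def render_coupled(frame, history, upscaler):
        mv = compute_motion_vectors(frame)
        frame = taa_pass(frame, history, mv)   # forced on
        return upscaler(frame, mv)

    # Decoupled design: vectors are exposed as their own step,
    # so TAA becomes optional while DLSS/FSR still get their vectors.
    def render_decoupled(frame, history, upscaler, use_taa=False):
        mv = compute_motion_vectors(frame)
        if use_taa:
            frame = taa_pass(frame, history, mv)
        return upscaler(frame, mv)
    ```

    The refactor is conceptually small, but as noted above, in practice the rest of the post-processing stack often assumes the TAA pass has run.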

    That being said: ray tracing really only makes a significant visual difference in many (not all, but many) situations… if you have very high-res textures.

    If you don’t, older light rendering methods work almost as well, and run much, much faster.

    Ray tracing involves… you know, light rays, bouncing off of models, with textures on them.

    Like… if you have a car with a glossy finish that reflects the entire scene around it in its paint… well, if the reflect map being added to the base car texture is very low res, because it is generated from a world of low-res textures… you might as well just use the old cube map method, or other methods, and not bother turning every reflective surface into a ray-traced mirror.
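    A toy illustration of that bottleneck, assuming simple nearest-neighbor texture lookups: however precisely a ray is traced, the reflected color still comes from a texture fetch, so reflection detail is capped by the source texture’s resolution.

    ```python
    # Purely illustrative: ray precision buys nothing past texture resolution.

    def sample_env(env, u, v):
        # Nearest-neighbor lookup into a square "environment texture".
        n = len(env)
        x = min(int(u * n), n - 1)
        y = min(int(v * n), n - 1)
        return env[y][x]

    # Fake "environment" textures: one high res, one very low res.
    hi_res = [[(x + y) % 256 for x in range(256)] for y in range(256)]
    lo_res = [[(x + y) % 256 for x in range(4)] for y in range(4)]

    # Two rays that differ by a small angle hit distinct texels in the
    # hi-res map, but the exact same texel in the lo-res map.
    a = sample_env(lo_res, 0.50, 0.50)
    b = sample_env(lo_res, 0.55, 0.55)
    c = sample_env(hi_res, 0.50, 0.50)
    d = sample_env(hi_res, 0.55, 0.55)
    print(a == b, c == d)   # True False
    ```

    With the low-res source, the two rays resolve to the same color, which is exactly when a cheap cube map would have looked just as good.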

    Or, if you’re doing accumulated lighting in a scene with different colors of lights… that effect is going to be more dramatic, more detailed, more noticeable in a scene with higher-res textures on everything being lit.

    I could write a 60 page report on this topic, but no one is paying me to, so I’m not going to bother.


  • Yep, that’s gonna be significantly more powerful than my planned build… and likely somewhere between $500 and $1000 more expensive… but yep, that is how absurd this is: all of that is still less expensive than an RTX 5090.

    I’m guessing you could get all of that to work with a 750 W PSU, 850 W if you also want to have a bunch of storage drives or a lot of cooling, but yeah, you’d only need that full wattage for running raytracing in 4k.

    Does that sound about right?

    Either way… yeah… imagine an alternate timeline where marketing and industry direction aren’t bullshit, where people actually admit things like:

    Consoles cannot really do what they claim to do at 4K… at actual 4K.

    They use checkerboard upscaling, so they’re basically running at 2K and scaling up, and it’s actually less than 2K in demanding ray-traced games, because they’re also using FSR or DLSS; oh, and the base graphics settings are a mix of what PC gamers would call medium and high, but consoles don’t show real graphics settings menus, so console gamers don’t know that.
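    The pixel-count arithmetic behind that claim, assuming checkerboard rendering shades roughly half the pixels of each frame:

    ```python
    # Pixel budgets for the "4K" console claim above.
    native_4k = 3840 * 2160          # 8,294,400 pixels
    checkerboard = native_4k // 2    # ~half the pixels shaded per frame
    native_1440p = 2560 * 1440       # 3,686,400 pixels

    print(checkerboard / native_4k)      # 0.5
    print(checkerboard / native_1440p)   # 1.125 -> barely more than a 1440p workload
    ```

    In other words, a checkerboarded “4K” frame shades only about 12% more pixels than native 1440p, before DLSS/FSR cut the internal resolution further.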

    Maybe, maybe we could have focused on just perfecting frame-per-watt and frame-per-$ efficiency at 2K, instead of being baffled with marketing BS claiming we can just leapfrog to 4K, and, more recently, being told that 8K displays make any goddamned sense at all, when in 95% of home setup situations, of any kind, they have no physically possible perceptible gains.
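    A rough acuity check on that 8K point, for an assumed (hypothetical) living-room setup: a 65-inch 16:9 TV (screen about 1.43 m wide) viewed from 2.5 m, with 20/20 vision commonly taken to resolve roughly 60 pixels per degree.

    ```python
    import math

    # Pixels per degree of visual angle for a given display and distance.
    def pixels_per_degree(h_pixels, screen_width_m, distance_m):
        fov_deg = math.degrees(2 * math.atan(screen_width_m / (2 * distance_m)))
        return h_pixels / fov_deg

    ppd_4k = pixels_per_degree(3840, 1.43, 2.5)   # ~120 ppd
    ppd_8k = pixels_per_degree(7680, 1.43, 2.5)   # ~240 ppd
    # 4K already exceeds the ~60 ppd acuity limit at this distance,
    # so the jump to 8K is not perceptible in this setup.
    print(ppd_4k, ppd_8k)
    ```

    You’d have to sit unusually close, or use a far bigger screen, before 8K pixels became distinguishable from 4K ones.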


  • In the US, a new RTX 5090 currently costs $2899 at NewEgg, and has a max power draw of 575 watts.

    (Lowest price I can find)

    … That is a GPU with roughly the cost and power usage of an entire, quite high-end gaming PC from 5 years ago… or even just a reasonably high-end PC from right now.

    The entire move to the realtime ray tracing paradigm, which has enabled AAA game devs to get very sloppy with development by not really bothering to optimize lighting or textures… and which has necessitated the invention of intelligent temporal frame upscaling and frame generation… the whole, originally advertised point of all this was to make high-fidelity 4K gaming an affordable reality.

    This reality is a farce.

    Meanwhile, if you jump down to 1440p, well, I’ve got a future build plan sitting in a NewEgg wishlist right now.

    RX 9070 (220 W) + Minisforum BD795i SE (mobo + non-removable, high-end AMD laptop CPU with performance comparable to a 9900X, but about half the wattage draw)… so far my pretax total for the whole build is under $1500, and, while I need to double and triple check this, I think the math on the power draw works out to a 650 W power supply being all you’d need… potentially with enough room to also add in some extra internal HDD storage drives, ie, you’ve got leftover wattage headroom.
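    A back-of-envelope version of that PSU math; every wattage except the RX 9070’s 220 W board power is my own rough assumption, not a measured figure:

    ```python
    # Back-of-envelope PSU sizing for the build above.
    gpu = 220           # RX 9070 board power (from the build description)
    cpu = 120           # mobile Ryzen under full load (assumed)
    mobo_ram_fans = 60  # motherboard, RAM, fans (assumed)
    storage = 30        # a few SSDs/HDDs (assumed)

    peak = gpu + cpu + mobo_ram_fans + storage   # 430 W worst-case-ish draw
    psu = 650
    headroom = psu - peak                        # 220 W spare
    print(peak, headroom)
    ```

    Under those assumptions the build peaks well under the 650 W rating, which is consistent with having headroom left over for extra drives.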

    If you want to go a bit over the $1500 mark, you could fit this all in a console sized ITX case.

    That is almost half the cost of the RTX 5090 alone, and will get you over 90 fps in almost all modern games at ultra settings at 1440p, though you will have to futz around with intelligent upscaling and frame gen if you want realtime ray tracing at similar framerates, and realistically, probably wait another quarter or two for AMD driver support and FSR 4 to mature and get properly implemented in said games.

    Or you could swap in maybe a 5070 (non-Ti; the Ti is $1000 more) Nvidia card, but seeing as I’m making a Linux gaming PC, you know, for the performance boost from not running Windows, AMD Mesa drivers are where you wanna be.