• 1 Post
  • 56 Comments
Joined 1 year ago
Cake day: June 20th, 2023




  • The future’s wasteland will be covered with the bodies of web stalkers who were naive enough to get tricked by mid-2010s shitposts.

    “Turns out they never used this to make their metal cutlery darker - who would have thought the ancients were so casually cruel?”

    “After months of research we have concluded that, despite all their technical achievements, the ancients never figured out what does the fox say.”

    “Today Prof. Drobyshevsky is going to tell us about their newest work in XXI cent. anthropology - what is a ‘streamer dent’, and why do we have such long heads 2300 years later?”

    “Ass, coochie and the rich - dietary practices of homo sapiens in the age of over-production”




  • Get into the habit of tracking your habits. When you know everything you do while “on autopilot” and why, you can outsource a lot of chores and work to your “autopilot” self by setting up your routines and habits correctly.

    This skill is best learned as soon as possible, and it’s a shame it’s not taught in schools. Your 20s are a good time - all the momentum you gain within the next 20 years can carry you the rest of the way.

    Also, don’t be hard on yourself for failing. You’ll see tons of good advice - a lot of it will seem essential (like being financially responsible), and for good reasons. Just know that failing at all these things does not necessarily make you a failure or a bad person. Who knows what struggles you will face - as long as you survive and take care of your loved ones, you should be ok. Ultimately that is all we can do.

    Also, try to engage with physical things more: people IRL next to you, touching grass, crafting something with your hands. If it’s not physical, it exists in your head - and your head might not always be the best place to spend most of your time.




  • It’s not that native UIs are lagging behind - there is a whole set of reasons.

    TL;DR: browsers, as opposed to desktop apps, are standardized - because they were originally designed to deliver and display text documents. We were never supposed to build complex application UIs on a web stack.

    First, there is no standard way of making a native UI on the desktop. Every OS uses its own solution, while Linux offers several different ones. Browsers rely on a set of open standards developed specifically for the web, and even there not everything works exactly the same across implementations.

    Second, browsers are designed to draw a very specific kind of UI through a very specific rendering mode - they run a retained hierarchy of elements (the DOM) through layout and painting engines. It works great for documents, but it becomes extremely unwieldy for most other things, which is why we have an entire zoo of different UI implementations (kludges, most of them) for browsers.

    On the desktop we often choose whichever UI technology best fits our purpose. For a game engine I would use an immediate-mode UI solution like Dear ImGui, for ease of prototyping, integration and fast iteration.
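
    To illustrate the immediate-mode idea: the UI is re-declared in code every frame, and the same call both draws a widget and reports interaction. Below is a minimal sketch using Dear ImGui’s API - the backend setup (context creation, platform/renderer glue, the per-frame NewFrame/Render calls) is assumed and omitted, and the function and variable names are just my own example.

    ```cpp
    #include "imgui.h"  // Dear ImGui core header

    // Hypothetical per-frame UI function, called between ImGui::NewFrame()
    // and ImGui::Render(). Application state lives outside the UI; widgets
    // read and write it directly.
    void DrawDebugPanel(float& volume, int& clickCount)
    {
        ImGui::Begin("Debug panel");                        // declare a window for this frame

        ImGui::Text("Times clicked: %d", clickCount);

        if (ImGui::Button("Click me"))                      // true only on the frame it was pressed
            ++clickCount;

        ImGui::SliderFloat("Volume", &volume, 0.0f, 1.0f);  // writes straight into our variable

        ImGui::End();
    }
    ```

    There is no separate widget tree to keep in sync: if a branch of code does not run on a given frame, its widgets simply do not exist that frame - which is exactly why this style iterates so fast for game-engine tooling.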

    For consumer software I might choose between something like Qt or GTK for robust functionality, reliable performance, accessibility and community support. Mobile platforms come with their own native UI solutions.
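
    For contrast, here is roughly what the retained-mode equivalent looks like with Qt Widgets (the window contents and counter logic are just an illustrative example): you build the widget tree once, connect signals to handlers, and the toolkit owns the objects and repaints them for you.

    ```cpp
    #include <QApplication>
    #include <QLabel>
    #include <QPushButton>
    #include <QVBoxLayout>
    #include <QWidget>

    int main(int argc, char* argv[])
    {
        QApplication app(argc, argv);

        // Build the widget tree once; Qt retains it and repaints as needed.
        QWidget window;
        auto* layout = new QVBoxLayout(&window);
        auto* label  = new QLabel("Clicks: 0");
        auto* button = new QPushButton("Click me");
        layout->addWidget(label);
        layout->addWidget(button);

        // React to events via signals and slots instead of per-frame polling.
        int clicks = 0;
        QObject::connect(button, &QPushButton::clicked, [&]() {
            label->setText(QString("Clicks: %1").arg(++clicks));
        });

        window.show();
        return app.exec();
    }
    ```

    The trade-off runs opposite to the immediate-mode sketch above: the retained objects give you accessibility and platform integration mostly for free, but you now have to keep that tree in sync with your application state.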

    For data-intensive UIs and heavy editors (e.g. CAD, video and music production, games) I might need to design an entirely new rendering pipeline to meet users’ requirements for ergonomics, speed, latency etc.

    It is also easy to notice that, as a team or employer, it is often much easier to hire someone for the web stack than for native development. Simply put, more people can code effectively in JS, so we get more JS - and tech like Electron enables that.

    If you are interested in a single solution that will get you decent results no matter the platform, you might have some success with projects like Flutter or OrbTK.

    UI rendering in general is a deep and very rewarding rabbit hole. If you are in the mood, this article by Raph Levien gives a good overview of existing architectures: https://raphlinus.github.io/rust/gui/2022/05/07/ui-architecture.html



  • I am not a professional educator, but in general I think it is worth starting with basic computer literacy: identifying the parts of a PC, being able to explain their overall functions, the difference between hardware and software, and what kinds of software a computer can run (firmware, operating systems, user utilities etc.). This would also be a perfect time to develop practical skills, e.g. (assuming you are a normatively-abled person) learning to touch-type and perform basic electronics maintenance, like opening your machine up to clean it and replace old thermal compound.

    After that, taking something like “Operating systems fundamentals” on Coursera would be a great way to continue.

    It really depends on your goals, resources and personal traits, as well as how much time and energy you can spare and how you like to learn. You can sacrifice an old machine, boot Ubuntu and break it a bunch of times. You can learn how to use virtualization and try a new thing every evening. You can get into ricing and redesign your entire OS GUI to your liking. You can get a single-board computer like a Raspberry Pi and try out home automation.


  • I might have phrased my thought too bluntly: I never intended to frame the problem as any sort of moral failure on the end users’ part. I view this as a failure of our educational institutions.

    In my mind, preferring to spend time on (e.g.) MS Office in class, instead of teaching proper computer literacy, is like trying to teach meal-prep with Philips air fryers instead of teaching how to cook.

    I hear you, and I too feel like it might be just my aspi-nerdiness speaking, but the same argument could be made about any subject that is considered fundamental to high-school ed. We don’t skip philosophy, sciences, languages and arts just because they seem less applicable than math or econ, or because “it’s impossible to learn everything”.

    Our civ made progress by inventing a fundamentally new tech that is accessible to everyone and now underpins everything. Letting people acquire the basic literacy needed to interface with this tech sustainably is the bare minimum we should be doing. I am not talking about turning kids into cyber wizards - just getting their computing skills up to a level that allows them to make relevant, informed choices.