Tranus

joined 1 year ago

Hi, I've spent the last few months working on a rogue-like bullet hell game for a school project. It uses a 0G movement system that's kind of similar to Asteroids, which I think makes for a surprisingly unique experience. It's finally getting to a state that feels like a real game, so this would be a great time for some playtesting! If anyone out there is interested, or just wants to play a free game, feel free to give it a try. Any feedback would be much appreciated.

[–] Tranus@programming.dev -1 points 6 months ago (1 children)

Y2K specifically makes no sense though. Any reasonable way of storing a year would use a binary integer of some width (especially when you want to use as little memory as possible). The same goes for date arithmetic: it is faster, more memory-efficient, and easier to implement on binary integers. With an 8-bit signed integer counting years from 1900, the concerning overflow would occur in 2028, not 2000. A base-10 representation would require at least 8 bits to store a two-digit number anyway, so there is no advantage to base 10, and there never has been.

For Y2K to have been anything more significant than a text formatting issue, a whole lot of programmers would have had to go out of their way to be really, really bad at their jobs. Also, usage of dates beyond 2000 would have increased gradually for decades leading up to it, so the idea that it would be any sort of sudden catastrophe is absurd.
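To make that overflow concrete, here's a quick Python sketch (the `to_int8` helper is just my own illustration) of a two's-complement 8-bit "years since 1900" counter: it tops out at 1900 + 127 = 2027 and wraps in 2028, nowhere near 2000.

```python
def to_int8(n: int) -> int:
    """Interpret n modulo 256 as a two's-complement 8-bit value."""
    n &= 0xFF
    return n - 256 if n >= 128 else n

for year in (2027, 2028):
    stored = to_int8(year - 1900)   # what an 8-bit signed counter would hold
    print(year, "->", 1900 + stored)

# 2027 -> 2027
# 2028 -> 1772  (wraparound, far from a year-2000 failure)
```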

[–] Tranus@programming.dev 1 points 7 months ago

It's not made from milk though, right? It wouldn't be vegan if it contained any animal products. And if it isn't made from milk, it's just not cheese, even if the microorganisms are the same.

[–] Tranus@programming.dev 0 points 11 months ago (5 children)

Well, letters don't really have a single canonical shape; there are many acceptable ways of rendering each one. While two letters might usually look the same, some shape could well be acceptable for one but not the other. So it makes sense to distinguish between them in the binary representation, which lets the interpreting software decide whether it cares about the difference.

Also, the Unicode code tables do mention which characters look (nearly) identical, so it's definitely possible to make a program treat something like a Greek question mark the same as a semicolon, as sketched below. I guess it's just that most software doesn't bother, since it's such a rare edge case.
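As a minimal sketch of that folding, assuming Python's standard `unicodedata` module: U+037E (GREEK QUESTION MARK) canonically decomposes to U+003B (SEMICOLON), so one pass of Unicode normalization makes the two compare equal.

```python
import unicodedata

greek_q = "\u037e"   # GREEK QUESTION MARK, visually identical to a semicolon
semicolon = ";"      # U+003B SEMICOLON

print(greek_q == semicolon)                                # False: distinct code points
print(unicodedata.normalize("NFC", greek_q) == semicolon)  # True: canonically equivalent
```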

[–] Tranus@programming.dev 0 points 11 months ago (2 children)

Not to justify it, but you can work around this with offline mode.