Asynchronous logging in Python / FastAPI
Have you ever been in a situation where your server receives multiple requests at once and the log lines from different requests get tangled together, so you can't tell which line belongs to which request? I have.
(lazy post, just want to get the idea out there)
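One common way to untangle concurrent requests is to stamp every log line with a per-request ID. Below is a minimal sketch of that idea using contextvars, a logging.Filter, and a FastAPI middleware; the names (request_id_var, RequestIdFilter, the "app" logger) are mine for illustration, and this is one possible approach rather than the only way to do it.

```python
# Sketch: tag each log record with a per-request ID so concurrent requests
# stay distinguishable in the output. Names here are illustrative.
import logging
import uuid
from contextvars import ContextVar

from fastapi import FastAPI, Request

# Task-local storage: each request's task sees its own value.
request_id_var: ContextVar[str] = ContextVar("request_id", default="-")


class RequestIdFilter(logging.Filter):
    """Copy the current request ID onto every log record before formatting."""

    def filter(self, record: logging.LogRecord) -> bool:
        record.request_id = request_id_var.get()
        return True


handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(asctime)s [%(request_id)s] %(message)s"))
handler.addFilter(RequestIdFilter())

logger = logging.getLogger("app")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

app = FastAPI()


@app.middleware("http")
async def add_request_id(request: Request, call_next):
    # Set the ID before calling the rest of the stack; downstream handlers
    # run in a context that inherits this value.
    request_id_var.set(uuid.uuid4().hex[:8])
    return await call_next(request)


@app.get("/")
async def root():
    logger.info("handling request")  # prints with this request's ID attached
    return {"ok": True}
```

With this in place, two overlapping requests still interleave in the output, but every line carries an ID, so grepping for one ID reconstructs that request's story.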