It feels like Scala doesn't really have types when all types implicitly coerce to everything else.
I thought we only allowed this behaviour in silly languages like JavaScript and R?
TRY TO ALWAYS LEARN SOMETHING NEW. EXPAND THAT BUBBLE OF COMFORT.
YOUR DAYS CAN FEEL FULLER BECAUSE SOMETHING DIFFERENT HAPPENS.
BEING STUCK IN A RUT AND DOING THE SAME ROUTINE CONSTANTLY MAKES TIME BLUR TOGETHER, AND MAKES IT FEEL LIKE IT'S MOVING BY TOO FAST.
STEP BACK, EXPERIMENT, READ SOMETHING, OR JUST MESS AROUND.
YOUR TIME WILL FEEL MORE SUBSTANTIAL FOR IT.
MAY YOU FIND SATISFACTION, FRIENDOS.
People like ten years ago:
"Having more computational power is only a good thing, imagine what ordinary people could learn, could discover?!"
>People use it to solve fixed difficulty maths problems that create nothing of worth
Expropriate Computers From Bitcoin Miners 🏴
@kellerfuchs I'd compare it to fread from data.table, which usually wins out in tests.
Or if you're happy to convert it into a different format, Hadley's feather package is super good.
@kellerfuchs hm. I know they're compressed in memory, but honestly I don't know how / how well, sorry! All I really know is that in use I haven't hit any hard limits around that number.
@webmind @amphetamine architecture of radio
@kellerfuchs yeah i was imprecise with my language.
So the compressed-array equivalent in both is slice indexing? data.frame has dplyr's select and filter verbs to get something similar, or just [X, Y] for 2D lookups, and data.table does it through its own built-in syntax.
@kellerfuchs although rereading this, I'm not entirely sure if that counts as a compressed array, as I wouldn't have described it like that. Let me know if you meant differently!
@kellerfuchs R has both data.frames, which are a form of compressed arrays, and data.tables, which are compressed arrays with indexing built in for faster lookups.
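The lookup difference described above can be sketched in plain Python rather than R — a filter that scans every row versus a key built once for direct lookups. The row data and names here are made up for illustration; none of this is actual R, dplyr, or data.table API.

```python
# Rough analogy (not R): rows as a list of dicts.
rows = [
    {"id": "a", "value": 1},
    {"id": "b", "value": 2},
    {"id": "c", "value": 3},
]

# data.frame-style filter: scan every row to find matches.
scan_hit = [r for r in rows if r["id"] == "b"]

# data.table-style keyed lookup: build an index once, then look up directly.
index = {r["id"]: r for r in rows}
keyed_hit = index["b"]

print(scan_hit[0] == keyed_hit)  # True: same row, found with different costs
```

Same result either way; the keyed version just pays the indexing cost up front instead of on every query.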
A lot of the time in R I'll batch anything over about 20GB, but I've comfortably worked with 10GB of data before as long as I was being sensible in manipulation requests.
Although when I tried to do a wide-to-long transform/melt that ended up at 22GB, it totally broke R. Ended up using Rust for it, but the result loaded back into R quickly.
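For anyone unfamiliar with that transform: a melt turns one wide row with a column per measurement into one long row per (id, measurement) pair. A minimal sketch in plain Python (column names are invented for illustration; this is not the R melt API):

```python
# Wide format: one row per id, one column per measurement.
wide = [
    {"id": 1, "x": 10, "y": 20},
    {"id": 2, "x": 30, "y": 40},
]

# Long format: one row per (id, variable) pair.
long = [
    {"id": row["id"], "variable": col, "value": row[col]}
    for row in wide
    for col in ("x", "y")
]

print(len(long))  # 4: every id/measurement pair becomes its own row
```

This is also why the output blew up to 22GB — the long form repeats the id for every measurement, so it's usually bigger than the wide form it came from.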
@kellerfuchs huh, do you not rate ggplot then?
Also I absolutely love this kind of tech. I wonder how easy it would be to code something similar and infer anything about the objects between me and the transmitters.
Or even just visualise WiFi using the same idea
It's kinda scary that there's this ring of satellites nearby, and fucking mass of towers so close. Like other directions are quiet, and then this.. https://witches.town/media/MZ9PahdGfg7ldkgN_H8
Why do they charge "convenience fees?" What do they even mean by that? Does that mean "convenient excuse to charge you more money for no reason?"
@SoniEx2 If you ported the compiler to haskell we'd have world peace in days
@brokenfingers THAT APPLIES TO BOTH CHOICES!
so like... JavaScript and R have similar things going on. they're both really ugly and a bit clunky to code in, but we put up with them because they're stellar at doing the thing they're designed for out of the box (manipulating web pages for JS, statistics for R). like R has all kinds of built-in statistical functions, native graphics, powerful string handling, etc.; JavaScript has the DOM, good enough support for asynchronicity that people use Node.js for stuff that doesn't deal with web pages, etc.
... I should go to sleep and stop trash posting :')