For those interested in more, I found this People Make Games video to be super interesting. https://youtu.be/lYaDXZ2MI-k?si=pxrioKZq_S7ka9il
Ohioan here. You’re not wrong. Sorry about JD Vance.
For me and mine, it’s carrots. Do you know how difficult it is to find carrot-free items? Impossible.
Devastating loss for the science community. I used this database in my PhD, and didn’t expect it to shut down ever.
Agreed, seems like a no-brainer. Typically this stuff is handled at an institutional level, with bad professors losing or failing to achieve tenure. But some results have much bigger implications than just “Uh oh, I cited that paper and it was a bad one.” Often, entire clinical pipelines are developed off of bad research, which wastes millions of dollars.
See also, the recent scandals in Alzheimer’s research. https://www.science.org/content/article/potential-fabrication-research-images-threatens-key-theory-alzheimers-disease
To be fair, my job involves very sensitive medical data. We’ve seen entire businesses shut down because of data breaches.
I’m 100% so far at my job, but we had one test that tricked somewhere around 30% of employees. They spoofed everyone’s supervisor and made it look like an urgent Teams message was pending.
Usually, if you get phished you lose your bonus. They made an exception that one time.
In grad school I worked with MRI data (hence the username). I had to upload ~500GB, somewhere around 100,000 MRI images, to our supercomputing cluster, and I wrote 20 or so different machine learning algorithms to process them. All said and done, I ended up with about 2.5TB on the supercomputer. About 500MB of that ended up being useful and made it into my thesis.
Don’t stay in school, kids.
Nah. Fenced epee for a bit in a college club. Height advantage was pretty great. I guess it just depends on the weapon.
Grim Dawn Item Assistant is your best friend. While you’re at it, Rainbow Item Names (or whatever it’s called).
Well if you liked PoE I doubt you’ll like D4. It’s a much simpler game. Sadly my only advice is to try GD and Last Epoch again. I’ve got hundreds of hours in the former and I just got 10 hours into the latter.
Last Epoch feels like a more approachable PoE. I thoroughly enjoy how the skills interplay with one another, but I still prefer the itemization in Grim Dawn.
The only reason I’m not playing GD currently is that I have too many QoL mods installed, so my cloud saving doesn’t work. Last Epoch cloud saves just fine on my Steam Deck lmao.
They raised my rent 20% over two years and priced me out of two apartments. Glad to see progress.
It’s been in development for a while: https://ieeexplore.ieee.org/abstract/document/1396377?casa_token=-gOCNaYaKZIAAAAA:Z0pSQkyDBjv6ITghDSt5YnbvrkA88fAfQV_ISknUF_5XURVI5N995YNaTVLUtacS7cTsOs7o
Even before the above paper, I recall efforts to connect (rat) brains to computers in the late 90s/early 2000s. https://link.springer.com/article/10.1023/A:1012407611130
It’s a bunch of neurons that speak to a computer through a microelectrode array. So they “speak to” the neurons with electric impulses, and then “listen to” what they have to say. The computer it’s connected to uses binary, but the neurons are somewhere in between digital and analog. Yes, the change in electrical potential is analog, but neurons are typically in their “on” state, recovering from their “on” state, or just chilling out.
The brain is incredible because of the network of connections between neurons that store information. It’ll be interesting to see if a small scale system like this can be used for anything larger scale.
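The on/recovering/resting behavior above can be sketched with a toy leaky integrate-and-fire neuron. This is my own illustration, not the model from the article; the threshold, leak, and refractory values are made up.

```python
# Toy leaky integrate-and-fire neuron showing the three regimes mentioned
# above: firing ("on"), a refractory period ("recovering"), and rest
# ("resting"). All parameters are hypothetical.

def simulate(inputs, threshold=1.0, leak=0.9, refractory=3):
    """Return one state label per time step of input current."""
    v = 0.0          # membrane potential (the analog part)
    cooldown = 0     # steps left in the refractory period
    states = []
    for i in inputs:
        if cooldown > 0:
            cooldown -= 1
            states.append("recovering")
            continue
        v = v * leak + i          # leak toward rest, then add input
        if v >= threshold:
            states.append("on")   # spike: the event an electrode would record
            v = 0.0
            cooldown = refractory
        else:
            states.append("resting")
    return states

print(simulate([0.5, 0.6, 0.0, 0.0, 0.0, 0.0, 0.5]))
```

The membrane potential is a continuous number, but what comes out is a train of discrete spike events, which is the “somewhere in between” part.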
Believe it or not, I studied this in school. There’s some niche applications for alternative computers like this. My favorite is the way you can use DNA to solve the traveling salesman problem (https://en.wikipedia.org/wiki/DNA_computing?wprov=sfla1)
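For flavor, here’s an in-silico caricature of how Adleman’s DNA experiment worked: generate a huge soup of random paths (strands ligating at random), then chemically filter out everything that isn’t a valid answer. The 4-city graph below is hypothetical, and `random.choice` stands in for the chemistry.

```python
import random

# Generate-and-filter, Adleman style, on a made-up 4-city directed graph.
# Edges (a, b) mean you can travel city a -> city b.
edges = {(0, 1), (0, 2), (1, 2), (2, 3), (1, 3)}
n = 4

def random_path(length):
    """'Ligate' a random walk along edges, like strands annealing in the tube."""
    path = [0]  # fix the start city, as Adleman did
    while len(path) < length:
        nxt = [b for (a, b) in edges if a == path[-1]]
        if not nxt:
            break
        path.append(random.choice(nxt))
    return path

solutions = set()
for _ in range(10000):  # massive parallelism, in miniature
    p = random_path(n)
    # "Filtering" steps: right length, right end city, every city exactly once
    if len(p) == n and p[-1] == n - 1 and len(set(p)) == n:
        solutions.add(tuple(p))

print(sorted(solutions))  # Hamiltonian paths from city 0 to city 3
```

The trick in the wet-lab version is that the “generate” step happens for billions of strands at once, which is why it scales in a way a for-loop doesn’t.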
There have been other “bioprocessors” before this one, some of which have used neurons for simple image detection, e.g. https://ieeexplore.ieee.org/abstract/document/1396377?casa_token=-gOCNaYaKZIAAAAA:Z0pSQkyDBjv6ITghDSt5YnbvrkA88fAfQV_ISknUF_5XURVI5N995YNaTVLUtacS7cTsOs7o. But this seems to be the first commercial application. Yes, it’ll use less energy, but the applications will probably be just as niche. Artificial neural networks can do most of the important parts (like “learn” and “remember”) and are less finicky to work with.
My job is 8:30 - 5 with a 30 minute lunch break. So almost.
But, we also get 2 days/week at home, and can flex time as required. Tons of international work, so the flexible hours are a godsend when time zones are against us.
It’s a salaried position, and depending on your supervisor and the stage of your career, you’re expected to work 40-45 hours a week. Deadlines and ugly projects tend to increase hours worked. I’m very lucky, as my industry can be pretty brutal with sudden ends to projects and unexpected layoffs.
Thanks for the recommendation, I was worried they would be missing some of my artists but they had 99% of my music. Can’t wait to ditch Spotify.
ETA: dear lord the sound quality is so much better. I had no idea what I was missing.
We’ve got some really good theories, though. Neurons make new connections and prune them over time. We know about two types of glutamate receptors at the synapse - AMPA and NMDA. AMPA channels open on the post-synaptic neuron when glutamate is released by the pre-synaptic neuron, and the AMPA receptor allows sodium ions into the cell, causing it to activate.
If the post-synaptic cell fires for long enough, i.e. receives strong enough input from other cells and enough AMPA receptors open, the NMDA receptor opens and calcium enters the cell. (Typically an ion of magnesium keeps it closed.) Once opened, it triggers a series of cellular mechanisms that cause the connection between the neurons to get stronger.
This is how Donald Hebb’s theory of learning works. https://en.wikipedia.org/wiki/Hebbian_theory?wprov=sfla1
Cells that fire together, wire together.
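That slogan is literally the simplest Hebbian update rule. A minimal sketch (the learning rate and the toy activity trace are made up; eta is standing in for the whole NMDA/calcium cascade above):

```python
# "Fire together, wire together" as arithmetic: the weight grows only on
# time steps where both the pre- and post-synaptic neuron fired.
eta = 0.1   # learning rate (hypothetical stand-in for the calcium cascade)
w = 0.0     # synaptic strength between one pre/post neuron pair

activity = [(1, 1), (1, 0), (0, 1), (1, 1)]  # (pre fired?, post fired?)
for pre, post in activity:
    w += eta * pre * post   # strengthens only on the two (1, 1) trials

print(w)  # 0.2: two coincident firings, each worth eta
```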
Actually, neuron-based machine learning models can handle this. The connections between the fake neurons can be modeled as a “strength”, or the probability that activating neuron A leads to activation of neuron B. Advanced learning models just change the strength of these connections. If the probability is zero, that’s a “lost” connection.
Those models don’t have physical connections between neurons, but mathematical/programmed connections. Those are easy to change.
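A minimal sketch of what that looks like in practice, with a hypothetical 3-neuron network: the connections live in a weight matrix, “learning” is just arithmetic on it, and a zero entry is a “lost” connection.

```python
import numpy as np

# W[i, j] is the strength of the connection from neuron j to neuron i.
W = np.array([[0.0, 0.8, 0.1],
              [0.3, 0.0, 0.05],
              [0.9, 0.02, 0.0]])

W[0, 1] += 0.1              # "learning": strengthen one connection
W[np.abs(W) < 0.1] = 0.0    # "pruning": weak links drop to zero (lost)

print(W)
```

Rewiring a physical synapse takes cellular machinery; rewiring this takes one assignment, which is a big part of why artificial networks are easier to train.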
Jfc just spent 15 minutes trying to cancel a newspaper subscription this morning. Shame I couldn’t wait six months to do so.