six microns
Given that human chromosomes are on the order of 5 to 10 microns, I am thinking this export regulation doesn’t apply to the hobby market. This is “use the machine in a clean room” level precision.
I was wondering if someone would bring up search engine indexing. Google certainly has the upper hand for LLM training data under Reddit’s new API terms, since they already have the comments indexed. This is a big reason I fear these API changes: they concentrate power in the hands of already powerful companies.
I totally agree that Reddit’s motivation is probably not related to LLMs and that the link I posted is more of an excuse than anything. Still, I am curious what people think about data scraping and LLMs in general.
I hope cross-posts are OK, but I am curious about Experienced Devs’ perspective on this as well, since the question is rather technical.
Copying my opinion from the other thread in case you don’t want to click through:
My personal opinion is that high API usage fees hurt open source LLMs (e.g. GPT4All). I would rather not see this new technology monopolized by those who can pay API fees.
Hehe, I just grabbed the number off Wolfram Alpha’s size comparison. It wouldn’t surprise me if it’s wrong; I’m not sure where they scrape the data from. Anyway, my point stands: six microns is still stupidly small. Some dust or a hair on the cutting edge and your precision goes out the window.