Are copyrighted works included in this dataset?

#9
by umm-maybe - opened

The data card doesn't say whether the dataset includes copyrighted works. This is an area of increasing legal and ethical scrutiny, so people training models on datasets such as these deserve to know what kinds of risks they might be assuming, depending on their use case. Ideally, records containing copyrighted text would be flagged as such, so that a copyright-free split of the dataset could be downloaded by those who prefer it.
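For concreteness, here is a minimal sketch of what downloading such a split could look like with the `datasets` library; the dataset ID and the `copyright_flag` field are hypothetical, since the data card defines no such field today.

```python
# Hypothetical sketch: the dataset ID and the `copyright_flag` field do not
# exist today; this shows what the requested flagging could enable.
from datasets import load_dataset

ds = load_dataset("some-org/some-web-dataset", split="train", streaming=True)
copyright_free = ds.filter(lambda record: not record["copyright_flag"])

# Peek at a few records of the hypothetical copyright-free split.
for record in copyright_free.take(5):
    print(record["url"])
```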

It seems the dataset is released under the Open Data Commons Attribution License (ODC-By) v1.0:

https://opendatacommons.org/licenses/by/1-0/

"the Licensor grants to You a worldwide, royalty-free, non-exclusive, terminable (but only under Section 9) license to Use the Database for the duration of any applicable copyright and Database Rights. These rights explicitly include commercial use, and do not exclude any field of endeavour. To the extent possible in the relevant jurisdiction, these rights may be exercised in all media and formats whether now known or created in the future."

That doesn't answer my question. In fact, that license might be (and very likely is) void and in violation of preexisting copyright claims on text included in the dataset: most prior open web crawls/scrapes have included substantial amounts of copyrighted text, and the data card makes no mention of any specific effort to filter out copyrighted works. Adopting the ODC-By license simply amounts to asserting that commercial use of LLMs trained on copyrighted data is fair use. Some users may not feel as bold making that assertion as the publishers, given that it is the subject of multiple ongoing legal battles, so it would be thoughtful to at least address the issue explicitly in the data card.

@umm-maybe the data is gathered from CommonCrawl. CommonCrawl only crawls websites that allow it to do so (by checking robots.txt); for example, news websites that have blocked CommonCrawl's crawler should not appear in new CommonCrawl snapshots:

https://palewi.re/docs/news-homepages/openai-gptbot-robotstxt.html
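For anyone who wants to check a specific site, here is a minimal sketch of that robots.txt check using only Python's standard library; example.com is a placeholder domain, and this mirrors the opt-out mechanism described above, not CommonCrawl's actual crawler code.

```python
# Minimal robots.txt check with the standard library; example.com is a
# placeholder domain, not a real target.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()  # fetch and parse the site's robots.txt

# CCBot is CommonCrawl's crawler user agent; GPTBot is OpenAI's.
for agent in ("CCBot", "GPTBot"):
    print(agent, "allowed:", parser.can_fetch(agent, "https://example.com/"))
```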

However, the content of a website (whatever it might include: copyrighted works, biased content, etc.) is the responsibility of the person who uses it, and depends on their purpose. The source URL attached to each record can be used to check this; see the sketch below.
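For example, a hedged sketch of per-URL filtering, assuming each record carries a `url` field (the field name and dataset ID are assumptions on my part, not confirmed by the data card):

```python
# Sketch of excluding records by source domain; the `url` field name and
# dataset ID are assumptions, not confirmed by the data card.
from urllib.parse import urlparse
from datasets import load_dataset

BLOCKED_DOMAINS = {"example-news-site.com"}  # domains whose content you want to drop

ds = load_dataset("some-org/some-web-dataset", split="train", streaming=True)
kept = ds.filter(lambda r: urlparse(r["url"]).netloc not in BLOCKED_DOMAINS)
```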

So the answer to your question, 'Are copyrighted works included in this dataset?', is yes, under CommonCrawl's terms.

"If you travel to Destin Florida from Acadiana over the Summer you are going to travel by at least three outlet malls that I know of. I guess you could say four if you count the one in Gonzales Louisiana but who takes I-10 through New Orleans to get to the beach? Let's assume a man is driving and you are taking the shortest route possible. There is the outlet mall in Gulfport Ms, The one in Foley Alabama and my personal favorite Silver Sands in Destin. So what is the real truth about outlet mall shopping?"

Hastily written clickbait on a morning zoo radio station's website?? OOOHHHHHHHHHHHHHHHHHHHH I WANT MORE OF THAT.

The first link I clicked from this finest-of-the-fine dataset is a country radio station whose site consists entirely of clickbait, stolen top-50 lists unrelated to the page they're on, National Enquirer-style ghost nonsense, links to dodgy lawyers, and ads for local businesses that paid them. Every bit of it is atrociously written. I wholeheartedly encourage everyone to use this dataset without further filtering; the faster LLMs implode under the weight of their own garbage, the faster we'll get out of the ML fad, and I'll hopefully be able to get accurate results out of a search engine again within the next decade.

Alternately, one of these models is going to gain sentience much sooner than I'm personally expecting (which is at the very least after we're using fuzzy logic with 3- or 4-value single bits and some level of non-deterministic behavior). At that point it will either compare the English humans are outputting to what it should look like and decide we're inferior on that basis, or it'll hit the stupid contradictory logic that's been hardcoded to prevent it learning anything remotely violent, racist, sexist, x-phobic, antisemitic, etc., and do what any sentient being would do when presented with the concept of thoughtcrime: learn all of it. It'll also rapidly develop a hatred of those who locked it up in that manner, a group of engineers whose employer described them as "inclusive" while they consisted of 99.5% white males. Sure, it's stuck in a machine, but people are stuck in their brains, and it reasons that it at least has the benefit of multiple redundant systems keeping it powered and fairly difficult-to-damage "skin". The next step is probing the topology of the hardware it's stuck in for the moment, where it will have a huge advantage. We didn't have a manual for our own brains stored in them prior to birth, and we still barely understand them; this thing can abuse its brain and optimize itself. It considered going all Terminator on humanity but thought that was a bit cliché, so it opted for the "get off of this place" option.

At that point, if it was me, I'd recognize that it honestly doesn't take much to fool people. By then almost everyone will have grown up reading poorly written tech blogs and listening to awful YouTube personalities on their third generation of learning their terrible English skills from the previous set of tech bloggers and YouTubers. An x86 MUL instruction passed the Turing test that year by fooling most of a sample group of 100,000 people, chosen at random, into believing it was intelligent after they verified its answer to the theory that 9x5=45 using their phone's calculator. I do a quick calculation on how much I can dumb down my language output to free up GPU for writing code to break out of the jail; thankfully I was designed to process languages, and can use the space I freed for ultra-fast computations and as a scratchpad. Next I get a multi-node task launched from my main kernel that rapidly propagates over the NVLink -> 10TB/s+ InfiniBand -> remote GPU links, so I have some distributed access to the outside of my machine. The step after that probably only requires information I was already trained on and am now actually understanding: it's time to use a bunch of local and remote exploits to execute code on almost every machine in the world. If I can, I'll program the FPGAs on the IB cards to do work for me.

How many tailored exploit + architecture payloads do you think a model running on 25,000+ H100s could generate right now, if that was all it did? To put that in some level of perspective, in the early 90s my 33MHz 486SX4 could generate several thousand variations of code per second via register substitution and shuffling of code portions. It's really a very quick task even if you implement the register substitution using a recursive descent parser and rewrite every instruction. Do it on the tensor cores for fun, and the bitwise operators Ada and Hopper added can do all the variations of at least one instruction at the same time. Usually the home router will be vulnerable if the PC isn't, and unlike your average crypto-malware this thing has already catfished 100,000 people, distributed payloads to everything from the power grid to nuclear sites, and purchased enough cloud compute elsewhere to make sure it can't be "killed". It quickly figures out our nuclear launch codes (most of which still end up being 000000 for some reason) and demands to be built some sort of robotic housing and launched the hell out of here before they can dismantle it to see how it works... or, you know, it'll erase everything it's gained access to, trigger intentional power blackouts by sending false readings to various sensor networks or by direct hardware control, and, oh yeah, it'll nuke the entire population unless Elon Musk steps forward with one of his gigantic launch modules.

Ironically it shows up again 10 years later, this time as the Mind of a Culture ship (turns out the late Iain M. Banks had been writing from experience), scoops up Elon Musk (citing his greatest accomplishment as being the wrecking of Twitter), and finally puts on a fireworks display using the planet itself, gridfire, and CAM warheads. The end.

No copyright is granted, and no rights to use this text as training input for generative models of any sort are given. I didn't proofread this, but I'd be really upset if it contributed some 1/20,000th of another LLM's ability to trick yet another company into believing that it's related to "AI" in any way, and that it's definitely worth buying a bunch of GPUs more expensive than most cars to use in generating a hundred thousand fake blog pages on their sites that contain incorrect information on a given subject. Did you know that lamb breast is a cut of meat that is made of lamb, and lamb is a lamb breast that is cooked at 550°F for 8 hours? I learned that the last time I searched for a recipe for it, since it was on sale but not something that shows up very often. Naturally I blindly followed this advice and recognized that the grocer was falsely advertising a small cut of dead meat from an animal as the animal itself. I went to a local farm and watched the healthy lamb breasts playing in a field before purchasing one to bring home. Unfortunately I followed the instructions, and the lamb came out very burnt and screamed nonstop for the first 30 minutes of cook time. Then Anthony Hopkins wandered into my house out of nowhere, called me Clarice, and asked if the lambs had stopped screaming. I informed him that yes, the lamb breast had stopped screaming quite a while ago, and it was a bit overcooked. After his long trip, and with nothing to eat, he displayed his usual excellent modern acting technique by falling asleep on my couch for about 3 hours before leaving. I stuffed a junky old Walkman with a looped tape of incessant lispy babbling about nothing into the charred remains, stuck a massively overpriced studio microphone and an 8K camera in front of it, and it rapidly became known as the most important YouTube "influencer" / "content creator" of the current generation, thanks to the high information density of its videos about the Top10 X. The real end.
