- iAmTheTot ( @iAmTheTot@kbin.social ) 50•1 year ago
4GB is an absolute fuck ton of text. Like, a solid chunk of Wikipedia would fit in there.
- sik0fewl ( @sik0fewl@kbin.social ) 13•1 year ago
If you were restricted to just 700MM words, what would you say?
- rgb3x3 ( @rgb3x3@beehaw.org ) 1•1 year ago
Probably just send them the Encyclopedia Britannica
- TonyTonyChopper ( @TonyTonyChopper@mander.xyz ) 7•1 year ago
the other day I learned you can download Wikipedia and it’s something like 50 GB of text plus 50 GB of pictures
- B0rax ( @B0rax@feddit.de ) 2•1 year ago
A few years ago (~2010) I had the entire German Wikipedia on my 8 GB iPod touch. It was only about 4 GB in size without media.
- BruceTwarzen ( @BruceTwarzen@kbin.social ) 4•1 year ago
So you send them text that they can’t read?
- iAmTheTot ( @iAmTheTot@kbin.social ) 12•1 year ago
I don’t think you appreciate how much text you can fit into 4GB. The first entire gigabyte could be dedicated to various means of translation and explaining our language system, and you’d still have around 500 million words left after that.
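Quick back-of-envelope on that, assuming plain UTF-8 text and roughly 6 bytes per average English word (the 6-byte figure is an assumption, not a measurement):

```python
# Rough word-count estimate for a 4 GiB plain-text payload.
# Assumes ~6 bytes per English word (5 letters + 1 space) in UTF-8/ASCII.
BYTES_PER_WORD = 6

total_bytes = 4 * 1024**3          # 4 GiB payload
translation_bytes = 1 * 1024**3    # hypothetical 1 GiB spent on translation aids

print(f"Whole payload: ~{total_bytes // BYTES_PER_WORD:,} words")
print(f"Left after translation material: ~{(total_bytes - translation_bytes) // BYTES_PER_WORD:,} words")
# Prints roughly 715 million and 536 million, which is where the
# "700MM words" and "500 million words" figures above come from.
```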
- PrivateNoob ( @PrivateNoob@sopuli.xyz ) 16•1 year ago
A link to a cracked Minecraft client and the IP of a small server to join and chill with that alien.
Technically you can probably send a bunch of links, like Wikipedia etc. He “just” needs access to it, which may or may not breach the 4GB rule.
- Joker ( @Joker@discuss.tchncs.de ) 12•1 year ago
Damn, what a cheapskate. A chance to play Minecraft with a friendly space alien and you can’t even pay for a legit copy. Probably going to give that alien a computer virus and doom us all. Don’t put this guy in charge.
- PrivateNoob ( @PrivateNoob@sopuli.xyz ) 19•1 year ago
Can agree with that, but I wanted to save the alien from having a Microsoft account.
- Joker ( @Joker@discuss.tchncs.de ) 4•1 year ago
You’ve got a point. On second thought, maybe a different game would be better.
- 30p87 ( @30p87@feddit.de ) 2•1 year ago
Or just Wurstclient so he can play cracked
- Valeena ( @Valeena@lemmings.world ) 3•1 year ago
Nah, they’re going to give the alien an Alienware!
It won’t breach the 4GB rule, but even the closest star to Sol would have a minimum latency of over 4 years.
- PrivateNoob ( @PrivateNoob@sopuli.xyz ) 3•1 year ago
Yeah that’s true.
- Gnorv ( @Gnorv@feddit.de ) 3•1 year ago
Why do you assume the alien has access to the internet?
- PrivateNoob ( @PrivateNoob@sopuli.xyz ) 6•1 year ago
No reason at all. The aliens probably won’t have access to the interwebz, of course, but playing Minecraft with an alien sounded funny in my head.
- AVincentInSpace ( @AVincentInSpace@pawb.social ) English15•1 year ago
I’m surprised no one has mentioned the actual messages humanity has sent into space in the hope that aliens will hear them, since those are actually pretty cool.
I mean look at this crap. Look at how cool it looks. https://www.plover.com/misc/Dumas-Dutil/messages.pdf
(I’m on mobile otherwise I’d add a picture)
- Pantherina ( @Pantherina@feddit.de ) 5•1 year ago
https://www.plover.com/misc/Dumas-Dutil/messages.pdf
- Again a cringe Word PDF title (why can’t this program use normal titles?)
- Why tf didn’t they just draw our numbers instead of some crazy art stuff?
- Pantherina ( @Pantherina@feddit.de ) 4•1 year ago
The dimension (physical height, 5’9") of an average man (blue/white)
Well thanks now they think we are all men
- EphTen ( @EphTen@lemdro.id ) English12•1 year ago
I would probably include an apology for using FAT32 and insist that we’ve made better filesystems since then.
- AlwaysNowNeverNotMe ( @AlwaysNowNeverNotMe@kbin.social ) 10•1 year ago
In_the_end.exe from LimeWire
- Max-P ( @Max_P@lemmy.max-p.me ) 9•1 year ago
Probably an AI model that fits in that size. It might not be one of our best models, but it would probably be a lot more useful to aliens than whatever else we’d decide to fit into 4GB.
They’d get most of the inner workings of our languages and how we hold conversations, and they’d generally be able to get answers to basic questions about humanity.
- DogMuffins ( @DogMuffins@discuss.tchncs.de ) English5•1 year ago
What? “An AI model” is not a compression algorithm. Why give the aliens an AI trained on some Wikipedia articles when you could just give them Wikipedia?
- Max-P ( @Max_P@lemmy.max-p.me ) 4•1 year ago
Because an LLM is more than just data: it’s like a big network of how syllables and words go together based on context. And that’s useful because language is how we communicate, how we connect ideas together, and how we share stories. It’s not just Wikipedia articles; it’s a database of relationships between words and concepts. It approximates how we think as humans.
Yes, AI is hella overhyped. Everyone wants to AI everything. But really, for this particular situation, I think the model data would actually be the best precompiled database of knowledge about humans that we could possibly provide at that size.
No, it’s not magic compression, but 4GB worth of parameters is still a lot. GPT4All has models just under 4GB. They’re not particularly impressive compared to OpenAI’s offerings, but I think you can extract a lot more practical first-contact information out of a basic model than out of 4GB worth of Wikipedia. It’s extremely lossy compression: it’s never gonna spit out articles verbatim, and it will hallucinate a ton of stuff.
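For scale, running one of those sub-4GB models locally only takes a few lines. A minimal sketch assuming the gpt4all Python bindings; the model filename is just an illustrative pick from their catalog, any small quantized model works the same way:

```python
# Chat locally with a small quantized model via the gpt4all bindings.
# The filename is an example; swap in any sub-4GB model from the GPT4All catalog.
from gpt4all import GPT4All

model = GPT4All("orca-mini-3b-gguf2-q4_0.gguf")  # downloads the weights on first run

with model.chat_session():
    reply = model.generate("Describe what a human is, in one paragraph.", max_tokens=200)
    print(reply)
```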
If we had more space I’d send all the major AIs we have, like DALL-E, LLaMA and GPT-4. Imagine you’re an alien: you’re presented with a keyboard and a monitor, and you know nothing about us. You can use DALL-E to try random letters and words and see if the output makes sense. Maybe you find out what a cat, dog, bat, frog, or apple looks like. You can then input those words into ChatGPT and get context as to when they’re used. What’s “a horse”? What’s “riding”? Put those into DALL-E; now you know what a “cat riding a horse” looks like. It can generate as many as you want, in any combination. Eventually you can figure out how to ask ChatGPT whether cats typically ride horses, cars, or bicycles, and what cats actually do.
Now imagine you’re a very advanced alien species that can easily process the model’s parameters. You’ve just downloaded the basics of humanity. You could map your language onto the model’s parameters, speak to us in our language, translate our answers into yours, and hold a basic conversation.
- DogMuffins ( @DogMuffins@discuss.tchncs.de ) English1•1 year ago
Sorry chief, you haven’t really explained why an AI model would be the best format.
It’s less dense than Wikipedia text. End of.
- Max-P ( @Max_P@lemmy.max-p.me ) 2•1 year ago
I’ve already explained some upsides and even given an example of how it could be used. What are your counterarguments? What advantages would the raw Wikipedia text have that would make it more useful? What assumptions are we making about the aliens’ knowledge of us? Are we looking to share our latest scientific breakthroughs, or just show them what humanity looks like? Are we trying to send them plans for how to build a spaceship to come visit us?
Arguably, if we’re looking for the perfect dataset to send them, we’d have to redo what we did with the golden records we sent along with the Voyager probes and carefully consider every bit of data we send, for what purpose, and how we expect them to be able to process and understand it. This is a broad philosophical discussion; I’m not looking to be right or have the best answer, I’m offering one idea, one potential answer. Everyone’s first thought is to send out as much of Wikipedia as we can. That doesn’t make for great discussion.
It’s less dense than Wikipedia text. End of.
That’s making a strong assumption that Wikipedia is already the densest and most detailed source of information we have, that aliens are able to read and understand English, and that this is the optimal format in which to present our knowledge.
I’m not arguing that LLMs encode more information. They certainly don’t. That’s not the point. I’m arguing that a model has a higher likelihood of being useful for communicating with us, and that’s the first thing we’d want to do with an alien species: open a dialogue. Language is the fabric of our entire world, and that’s what large language models do: language. The model is a representation of billions of relationships between words (or tokens, to be technical) in the input and the probabilities of which words/tokens come out. When it sees “wheel”, that signal propagates through the network and all the weights, and it comes up with probabilities that it’s related to “car”, “bicycle”, “tree”, “mountain”. Does it even know what that implies? Nope. It just knows you’re much more likely to be talking about cars than trees when a wheel is involved. Billions of those relationships are encoded in an LLM, along with how weak or strong each relationship is. That’s useful information, especially when language and communication are involved.
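You can poke at those relationships directly. A small sketch assuming PyTorch and the Hugging Face transformers library, with GPT-2 standing in as a toy example model (not one of the sub-4GB chat models above):

```python
# Show which tokens a language model considers most likely to follow a prompt.
# GPT-2 is used purely as a small, freely downloadable example.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

inputs = tokenizer("A wheel is part of a", return_tensors="pt")
with torch.no_grad():
    next_token_logits = model(**inputs).logits[0, -1]   # scores for the next token

probs = torch.softmax(next_token_logits, dim=-1)
top = torch.topk(probs, k=5)
for p, idx in zip(top.values, top.indices):
    # The five most probable continuations -- the kind of word-to-word weighting described above.
    print(f"{tokenizer.decode(idx.item())!r}: {p.item():.3f}")
```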
If we had an LLM for ancient and long-forgotten languages, we wouldn’t even need things like the Rosetta Stone. We could keep throwing inputs at it, see what comes out, and make deductions based on that. We’d also get some information and stories from the period as a bonus, since those are somewhat embedded in the model. But the main point is, you can give it as many inputs as you want and it’ll generate as many outputs as you ask for, way, way more than the size of the model itself. You could have an entire conversation with an AI Egyptian or something and learn the language. Similarly, an alien could get semi-fluent in English by practicing with the model for as long as they need. Heck, we already do this as humans: there are plenty of tips about using ChatGPT to practice and refine your presentations and papers, prepare for interviews, etc.
That’s my value proposition for shipping an AI model: language and general culture over raw scientific data.
- DogMuffins ( @DogMuffins@discuss.tchncs.de ) English1•1 year ago
Sorry mate, this is so daft. It’s like you have an AI-shaped hammer and you’re trying to hit every problem with it.
Just send whatever and let the aliens use it to train their own LLM.
- Gnorv ( @Gnorv@feddit.de ) 3•1 year ago
AIs do a lot of different things. What kind of AI do you mean?
- 30p87 ( @30p87@feddit.de ) 9•1 year ago
stable diffusion trained off of gay furry porn
- pan_troglodytes ( @pan_troglodytes@programming.dev ) English9•1 year ago
rickroll at 8k for however many seconds that is
- Hadriscus ( @Hadriscus@lemm.ee ) 1•1 year ago
Rick rolled until the heat death of the universe
- rosymind ( @rosymind@leminal.space ) 7•1 year ago
“Stay away” with various methods given to understand the meaning of the words (images, signs, numbers, sounds, etc)
- Yote.zip ( @yote_zip@pawb.social ) English6•1 year ago
- Zellith ( @Zellith@kbin.social ) 6•1 year ago
asl?
I imagine sign language would be much less information-dense than text, since it would have to be pictures. And they may not even understand sign language.
- AVincentInSpace ( @AVincentInSpace@pawb.social ) English5•1 year ago
I think they meant “age/sex/location?”, which was a common first question when getting to know someone online back when instant messengers were a novel concept.
- DogMuffins ( @DogMuffins@discuss.tchncs.de ) English3•1 year ago
18 f cali
- Gamma ( @GammaGames@beehaw.org ) English4•1 year ago
Undertale 😎
- toxicbubble420 ( @toxicbubble420@beehaw.org ) 4•1 year ago
an SOS
- Hadriscus ( @Hadriscus@lemm.ee ) 3•1 year ago
onenightinparis.mov
- sour ( @sour@kbin.social ) 3•1 year ago
bill wurtz history of the world
- rebul ( @rebul@kbin.social ) 2•1 year ago
We surrender.