I have a theory that it should have a very different “personality” (probably more like a writing style) depending on the language, because each one draws on an entirely different set of training data

In English chatGPT is rather academic and has a recognisable style of writing; if you’ve used it a bit, you can usually tell something was written by it just by reading it.

Does it speak in a similar tone, with similar mannerisms, in other languages? (Where possible, obviously; some things don’t translate.)

I don’t know a second language well enough to have a natural conversation, so I’m unable to test this myself, and I may have worded things awkwardly out of a lack of understanding

  • In two languages I’m learning, German and Chinese, I’ve found it to suffer from “translationese”. It’s grammatically correct, but the sentence structure and word choice feel like the answer was first written in English then translated.

    No single sentence is wrong, but overall it sounds unnatural and has none of the “flavor” of the language. That also makes it bad for learning: it avoids a lot of sentence patterns you’ll see and hear in day-to-day life.

      • As a native German speaker, I agree that ChatGPT is very English-flavored. I think it’s just because the amount of English training data is so much larger that the patterns it learned from it bleed over into other languages. Traditional machine translations are also often pretty obvious in German, but they are more fundamentally wrong in a way that ChatGPT isn’t.

        It’s also somewhat cultural. The output you get from ChatGPT often sounds overly verbose and downright ass-kissing in German, even though I know I wouldn’t get that impression from the same output in English, simply because the way you communicate in professional environments is vastly different. (There is no German equivalent to “I hope this email finds you well”, for example.)

    • No single sentence is wrong, but overall it sounds unnatural and has none of the “flavor” of the language.

      I’ve also found that it’s often contextually wrong. It’s as if it doesn’t know what’s going on around it, or how to interpret the previous paragraph or even the previous sentence, let alone the sentence two pages back that was actually relevant to the one it’s now working on.

  • In English chatGPT is rather academic

    If by “academic” you mean it sounds like an undergraduate desperately trying to fill a page count, then yes. It tends to waffle like crazy.

  • If you ask ChatGPT to communicate with you in a different writing style, it can do a decent job of it. It will also respect requests to decrease verbosity and formality. The default writing style is some specific configuration they have made for it; it’s not a fundamental characteristic of the model.

    • This makes me wonder whether they’ve written that configuration for every language, though, or whether the English instructions work in other languages

      I wonder if you could tell it to write like Shakespeare or something in English, then have a chat with it in Spanish and have that persist

      My guess would be that it wouldn’t transfer; otherwise it’d need to have some understanding of the words beyond just the language they’re in

      • I think the misunderstanding here is in thinking ChatGPT has “languages”. It doesn’t choose a language; it is always drawing from everything it knows. The ‘configuration’ is therefore the same for all languages: it’s basically an invisible prompt telling it, in plain text, how to communicate.

        When you change or add your personalized “Custom Instructions”, it’s basically the same thing.

        I would assume that this invisible context is in English, no matter what. It should make no difference.

        • I struggle to grasp how that could work though

          It’s basically just predicting what word should come next, based on many, many examples, but in very few of those examples would a conversation span multiple languages

          Sure, it’s drawing from all of its training at all times, but that training would inherently be separated by language

          The general explanation, at least as far as I know, is that preprompts work because the model can predict how people would normally respond to a given instruction, but there would be few or no examples to draw on of a message being sent in one language and acted on in another. The sketch below shows concretely what that setup looks like.
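
          To make the setup concrete, here’s roughly what the “invisible prompt in English, conversation in Spanish” arrangement looks like through the API. This is only a sketch: the model name and the instruction text are placeholders, not ChatGPT’s actual hidden prompt.

          from openai import OpenAI

          client = OpenAI()  # reads OPENAI_API_KEY from the environment

          messages = [
              # The "configuration" is just plain English text prepended to the chat.
              {"role": "system",
               "content": "Write like Shakespeare. Keep answers under three sentences."},
              # The user writes in Spanish; the model sees both as one combined prompt.
              {"role": "user",
               "content": "¿Puedes explicarme qué es una red neuronal?"},
          ]

          response = client.chat.completions.create(
              model="gpt-3.5-turbo",  # placeholder model name
              messages=messages,
          )
          print(response.choices[0].message.content)

          Whether the English style instruction actually carries over into the Spanish reply is exactly the empirical question here.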

  • I can’t tell the quality or “flavour” apart between English and Spanish. Spanish is my first language, though, if that tells you anything. IMO the performance in those two languages is the same, with the caveat that I have used it only for generic purposes (writing resumes, rephrasing stuff)

    • If you give it instructions in English and then switch to Spanish, does it continue to follow them in Spanish?

      (As in, if you ask it to play the character of John the cheese merchant and then ask it what its name is in Spanish, does it respond in Spanish with the correct name? There’s a rough API sketch of this test further down the thread.)

      • I haven’t tried anything complicated, but it does switch languages when you do. I’ve only tried GPT-3.5, though, and only with prompts that “ended” in one answer (not something like asking the AI to play characters or answer in a certain way, but questions that can be answered in a single message)
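
        For anyone who wants to run the cheese-merchant test themselves, here’s a rough sketch through the API rather than the ChatGPT UI. The model name and wording are just placeholders; the point is that the persona instruction is in English and the follow-up question is in Spanish.

        from openai import OpenAI

        client = OpenAI()
        MODEL = "gpt-3.5-turbo"  # placeholder model name

        history = [
            {"role": "user",
             "content": "For the rest of this chat, play the character of John, a cheese merchant."},
        ]

        # First turn: set up the persona in English.
        reply = client.chat.completions.create(model=MODEL, messages=history)
        history.append({"role": "assistant", "content": reply.choices[0].message.content})

        # Second turn: ask its name in Spanish and see whether the persona persists.
        history.append({"role": "user", "content": "¿Cómo te llamas y a qué te dedicas?"})
        reply = client.chat.completions.create(model=MODEL, messages=history)
        print(reply.choices[0].message.content)  # hopefully an answer in Spanish, as John the cheese merchant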

  • It’s kind of weird, but it can be way more specific in Spanish. I mostly use English for programming or tech-related questions, and Spanish for research or when its response in English isn’t relevant or is less informative than I expected.