this post was submitted on 01 Jul 2023
10 points (85.7% liked)

Japanese Language

1403 readers

Welcome to JapaneseLanguage! If you're interested in Japanese, please sign up and study with us! Any topic or question related to the Japanese language is welcome. This is a community dedicated to the Japanese language. Feel free to come in and ask questions or post your thoughts and opinions about this beautiful language.

Feel free to check out the web archive of r/LearnJapanese's resources if you're looking for more learning material or tools to aid you in your Japanese language journey!

—————————

Remember that you can add furigana to your posts by writing ~{KANJI|FURIGANA}~ like:

~{漢字|かんじ}~ which comes out as:

{漢字|かんじ}
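As an illustration of how that markup works, here is a minimal sketch of a converter that turns the `~{KANJI|FURIGANA}~` notation into standard HTML `<ruby>` annotations. This is an assumption about how a client might render the markup, not the actual renderer this site uses:

```python
import re

# Matches ~{KANJI|FURIGANA}~ and captures the two halves.
FURIGANA_PATTERN = re.compile(r"~\{([^|{}]+)\|([^|{}]+)\}~")

def furigana_to_ruby(text: str) -> str:
    """Convert ~{KANJI|FURIGANA}~ markup into HTML <ruby> annotations.

    A hypothetical rendering step; the base text becomes the ruby body
    and the furigana becomes the <rt> reading above it.
    """
    return FURIGANA_PATTERN.sub(r"<ruby>\1<rt>\2</rt></ruby>", text)

print(furigana_to_ruby("~{漢字|かんじ}~"))
# <ruby>漢字<rt>かんじ</rt></ruby>
```

Browsers display `<ruby>漢字<rt>かんじ</rt></ruby>` as 漢字 with かんじ rendered in small type above it.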

founded 1 year ago

By "learning" I do not mean translating long sentences or anything like that; I mean things like asking: "Is this Japanese sentence correctly written? 雲の日が好きです" In this example it answered: "

Yes, the Japanese sentence "雲の日が好きです" is correctly written. It translates to "I like cloudy days" in English. Here's a breakdown of the sentence:

  • 雲 (くも/kumo): This means "cloud."
  • の (no): This is a possessive particle that indicates the relationship between "cloud" and "day" in this context.
  • 日 (ひ/hi): This means "day."
  • が (ga): This is a subject marker particle that marks "cloudy day" as the subject of the sentence.
  • 好きです (すきです/suki desu): This means "like" or "to like" and is a polite way of expressing one's preference. The verb 好き (suki) means "to like," and です (desu) is a polite copula used to indicate politeness and formality.

Therefore, the sentence "雲の日が好きです" correctly conveys the meaning "I like cloudy days" in Japanese.

"

Is this correct? Do you consider it reliable?

Sometimes I also ask questions like: "What function does 'こと' grammatically fulfill in this sentence? .... "

[–] RightHandOfIkaros@lemmy.world 6 points 1 year ago* (last edited 1 year ago) (1 children)

In my experience, ChatGPT gives incorrect information when asked about the specifics of just about any subject.

For example, if you ask ChatGPT about the "Piano Puzzle" in Silent Hill 1, it will reply that the puzzle exists. Interestingly, when I asked it to recite the poem that accompanies this puzzle by its name alone, "A Tale of Birds Without A Voice," I was surprised to see that it recited the poem mostly correctly. However, it left out the lines at the end of the poem beginning with the Crow, and while it provided the answer to the riddle in the correct order, the Crow was missing there as well.

However, it also invented steps to solve the puzzle. It said the piano keys are numbered and simply advised to "press the key corresponding to [bird name]," which is not helpful for the puzzle's actual solution. The piano keys are in fact not numbered; the solution requires the player to find which piano keys make no sound and press those keys in the order of the birds described in the poem.

Next, I asked ChatGPT to analyze the specific language used in the riddle poem. The poem requires the reader to have prior knowledge of the color of each bird's feathers to determine which piano key each bird refers to. I asked ChatGPT whether requiring this prior knowledge was bad riddle design, and it responded with "It can be a drawback, here are points to consider." It then listed four points (accessibility, fairness, clarity and guidance, immersion and engagement) and explained how each relates to how a puzzle is designed (in a generic sense, not specific to this puzzle). It concluded in classic AI fashion by rewording its first paragraph and then saying, "Ideally, puzzles should provide players with the necessary information or clues within the game world to encourage exploration, deduction, and problem-solving. This allows for a more inclusive and engaging puzzle experience." Which is basically buzzword diarrhea.

All in all, I think that while ChatGPT is a great tool for creative exercises and for suggestions in worldbuilding and other creative pursuits, anything that relies on being factual or correct should not rely on ChatGPT. It simply provides too much wrong information too often. And while it sometimes gets things right, a lie is best told between two truths.

[–] hierophant_nihilant@reddthat.com 1 points 1 year ago (1 children)

Lol, did you write this reply with ChatGPT, or is "All in all" just a nod to text generated by it? Overall, your comment is hilarious XD

[–] RightHandOfIkaros@lemmy.world 2 points 1 year ago

I didn't use it to write the comment, nor was it intentional, but I will concede that it does sound like something ChatGPT might say. The most important point I was trying to make still stands, though: ChatGPT is really bad as a reliable source of information, and anyone expecting it to give factual information about everything they ask it is a moron.