Domino on Linux/Unix, Troubleshooting, Best Practices, Tips and more ...


Daniel Nashed

ChatGPT presents derived answers as facts

Daniel Nashed – 29 May 2023 10:52:29

Depending on what you ask, ChatGPT's answers can be quite good.
But ChatGPT also derives information and is not always right.
For LotusScript, for example, it combines information from multiple different areas and even invents new properties it thinks it knows about.
Sometimes it also mixes up different classes.

I have tested it with various types of questions: more complicated, real-world questions about LotusScript, but also about libcurl code and other topics.

It even makes up notes.ini variables that do not exist.
I got questions from another partner about certain parameters, which turned out not to exist. And the source of the information turned out to be ChatGPT.

It's not wrong to use ChatGPT to get an idea, and it can be very helpful in pointing you to solutions.
But please make sure you validate what you get back and do not take it 1:1 as the ultimate truth.
Especially make sure you don't talk to me, other consultants, or HCL support about information from ChatGPT in a way that implies those parameters exist and have been validated.

One best practice is to take the parameters you get and search for them yourself, to see if they show up in another context.
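Beyond searching, you can also sanity-check suggested settings against a real server's configuration. The sketch below is a minimal, hypothetical example: it parses a notes.ini file and reports which of a list of suggested parameter names are actually set there. The file path and the parameter names in the usage comment are illustrative assumptions, not validated Domino settings.

```python
# Minimal sketch: check whether parameters suggested by a chatbot
# actually appear in an existing notes.ini before trusting them.
# Presence in one file is not proof the parameter is supported,
# but absence everywhere is a strong hint it may be made up.

def read_notes_ini(path):
    """Parse a notes.ini file into a dict keyed by upper-cased names."""
    params = {}
    with open(path, encoding="utf-8", errors="replace") as f:
        for line in f:
            line = line.strip()
            # Skip blank lines, comments, and lines without key=value form
            if not line or line.startswith(";") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            params[key.strip().upper()] = value.strip()
    return params

def check_suggested(path, suggested):
    """Return {name: True/False} for each suggested parameter name."""
    params = read_notes_ini(path)
    return {name: name.upper() in params for name in suggested}
```

For example, `check_suggested("/local/notesdata/notes.ini", ["ServerTasks", "Some_Imaginary_Param"])` would flag which of the two names is actually present in that file (both names here are placeholders for whatever ChatGPT suggested).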

-- Daniel
Comments

1. Lars Berntrop-Bos – 30.05.2023 8:24:10 – ChatGPT presents derived answers as facts

Yup. I tried asking it for a bit of LotusScript code, and it made up a NotesUIDocument method that does not exist. It took some asking before it finally admitted that.

So I can confirm it cannot be relied upon without firm validation.

2. Vitor Pereira – 30.05.2023 11:17:28 – ChatGPT presents derived answers as facts

Are you sure it's not just undocumented notes.ini variables or object methods from the IBM days? ;-)
