Domino on Linux/Unix, Troubleshooting, Best Practices, Tips and more ...

 
ChatGPT presents derived answers as facts

Daniel Nashed  29 May 2023 10:52:29

Depending on what you ask, ChatGPT's answers can be quite good.
But ChatGPT also derives information and is not always right.
For LotusScript, for example, it stitches together information from several different areas and also makes up new properties it only thinks it knows about.
Sometimes it also mixes up different classes.

I have tested it with various types of questions, including more complicated real-world questions about LotusScript, but also about libcurl code and other topics.

It even makes up notes.ini parameters that do not exist.
I got questions from another partner about certain parameters which turned out not to exist, and the source of the information was ChatGPT.

It's not wrong to use ChatGPT to get an idea, and it can be very helpful in pointing you to solutions.
But please make sure you validate what you get back and do not take it 1:1 as the ultimate truth.
Especially, make sure you don't talk to me, other consultants, or HCL Support about information from ChatGPT in a way that implies those parameters exist and have been validated.

One best practice is to take the parameters you get and search for them yourself, to see whether they show up in another, independent context. A quick complementary check on the server side is shown in the sketch below.
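As a small illustration (my sketch, assuming a server-side agent and a placeholder parameter name, not anything suggested in this post), NotesSession.GetEnvironmentString with the second argument set to True reads the raw notes.ini entry. Keep in mind that an empty result only means the setting is not currently set; Domino silently ignores unknown notes.ini entries, so the documentation and an independent search remain the real validation.

Sub Initialize
    ' Read a notes.ini setting from the notes.ini of the machine running the agent.
    ' "SOME_SUGGESTED_PARAMETER" is only a placeholder, not a real parameter.
    Dim session As New NotesSession
    Dim value As String

    ' Second argument True = treat the name as a system variable (raw notes.ini entry, no "$" prefix added)
    value = session.GetEnvironmentString("SOME_SUGGESTED_PARAMETER", True)

    If value = "" Then
        Print "Parameter is not set here - verify it in the documentation before relying on it"
    Else
        Print "Parameter is currently set to: " & value
    End If
End Sub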

-- Daniel
