Why is the data returned from the ChatGPT extension flow not the same as the result returned from the ChatGPT website?

Is there another way to integrate ChatGPT into a Cognigy flow?

  • Hi Amr,

    Where did you find that extension? I can't find a ChatGPT extension in our docs or marketplace.

    An extension is typically just a wrapper around some Axios code that sends a request to an API. The likely explanation is that the ChatGPT web chat in your screenshot uses a different API (and model) than your extension does.
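To make the distinction concrete, here is a hedged sketch of what such an extension most likely does under the hood: a plain HTTP call to OpenAI's completions API (the GPT-3 family), not to the ChatGPT web application, which has no public API. The endpoint and model name below are the GPT-3-era ones, and the function names are illustrative, not taken from any specific extension.

```javascript
// Sketch (assumption): what a "ChatGPT" extension probably does -- an HTTP
// request to OpenAI's completions endpoint, which serves GPT-3 models,
// NOT the model behind the ChatGPT web chat.
const OPENAI_URL = "https://api.openai.com/v1/completions";

// Build the request the extension would send (pure function, easy to inspect).
function buildCompletionRequest(prompt, apiKey) {
  return {
    url: OPENAI_URL,
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: {
      model: "text-davinci-003", // a GPT-3 model, not what ChatGPT runs
      prompt,
      max_tokens: 256,
      temperature: 0.7,
    },
  };
}

// Send it (Node 18+ global fetch; Axios-based extensions do the same thing
// with axios.post(url, body, { headers })).
async function askGpt3(prompt, apiKey) {
  const { url, headers, body } = buildCompletionRequest(prompt, apiKey);
  const res = await fetch(url, {
    method: "POST",
    headers,
    body: JSON.stringify(body),
  });
  const data = await res.json();
  return data.choices[0].text.trim();
}
```

Whatever model is named in that `model` field is what you actually talk to, which is why the answers differ from the ChatGPT site.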

  • This is the link to the extension in the marketplace

    And I'm using the same OpenAI API.

  • Hi Amr,

    tl;dr: ChatGPT != GPT-3

    Longer explanation: ChatGPT is an application based on GPT-3.5 (the next interim version of GPT-3) with some additional training. It is a free beta and has no API (on purpose), so you cannot use it from within Cognigy.ai (or any other application).
    To get similar answers, you have to do some prompt engineering. But since you do not have access to the same model ChatGPT uses, you will still get different results (which may be good enough for many use cases if you do the prompt engineering well).
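The prompt-engineering idea above can be sketched as follows: wrap the user's question in a chat-style instruction so a plain GPT-3 completion behaves more like an assistant. The instruction wording, the `Human:`/`AI:` turn format, and the function name are all assumptions to tune for your use case; as noted, the results will still differ from ChatGPT.

```javascript
// Minimal prompt-engineering sketch (assumption: instruction wording and
// turn format are illustrative). The trailing "AI:" cues the completion
// model to answer in the assistant role.
function buildChatStylePrompt(userQuestion, history = []) {
  const instruction =
    "The following is a conversation with a helpful, polite AI assistant. " +
    "The assistant answers accurately and admits when it does not know.";
  const turns = history
    .map(({ user, ai }) => `Human: ${user}\nAI: ${ai}`)
    .join("\n");
  return (
    `${instruction}\n\n` +
    (turns ? turns + "\n" : "") +
    `Human: ${userQuestion}\nAI:`
  );
}
```

The returned string would then be sent as the `prompt` of a GPT-3 completions request, with earlier turns passed in `history` to keep conversational context.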

