Author: Thomas

  • What is the difference between custom GPTs and Fine-Tuning a GPT?

    What is the difference between custom GPTs and Fine-Tuning a GPT?

    In November 2023, OpenAI rolled out custom GPTs:

    We’re rolling out custom versions of ChatGPT that you can create for a specific purpose—called GPTs. Anyone can easily build their own GPT—no coding is required.

    So the question arises:

    What is the difference between custom GPTs and Fine-Tuning?

    • Custom GPTs are all based on the same AI model, which remains unaltered. You "just" give the GPT instructions and documents, and these can be modified at any time to keep the GPT up to date. Basically, you are uploading some PDFs to improve its answers.

      Imagine giving a new employee some printouts so they can better answer basic client phone calls. These manuals might help answer some specific questions, but they will not train your new employee.

    • Fine-tuning is giving new knowledge to an AI by retraining it. You "feed it" new data that becomes part of the AI, thereby changing the core of that specific model. The whole model needs to be retrained, so you are looking at costs in the thousands.

      It’s like sending someone to a course or to university. It will deeply alter that person’s knowledge and understanding.

    So in the second case the AI is modified at its core, while in the first case it is all about providing instructions to guide the existing AI. Fine-tuning is more complex and expensive; you need new, high-quality data in the right format. Both approaches are sketched with the OpenAI Python library below.
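
    To make the first case concrete, here is a rough API-side analogue of what a custom GPT does: the base model stays untouched, and you simply pass your instructions plus some document text along with the user's question. This is only a minimal sketch; the file name, instructions and model choice are assumptions for illustration (custom GPTs themselves are built in the ChatGPT interface without any code).

    from openai import OpenAI

    client = OpenAI()  # reads the OPENAI_API_KEY environment variable

    # Hypothetical instructions and document, standing in for what you would
    # type and upload when building a custom GPT.
    instructions = "You answer client questions using only the manual below."
    with open("product_manual.txt", encoding="utf-8") as f:  # assumed local file
        manual = f.read()

    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model; it is never retrained
        messages=[
            {"role": "system", "content": instructions + "\n\n" + manual},
            {"role": "user", "content": "How do I reset my password?"},
        ],
    )
    print(response.choices[0].message.content)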
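
    The second case works differently: you prepare training examples in the required chat format and launch a training job against a base model. A minimal sketch, assuming a hypothetical train.jsonl file and an example base model name:

    from openai import OpenAI

    client = OpenAI()  # reads the OPENAI_API_KEY environment variable

    # train.jsonl is a hypothetical file; each line holds one chat-formatted example:
    # {"messages": [{"role": "system", "content": "..."},
    #               {"role": "user", "content": "..."},
    #               {"role": "assistant", "content": "..."}]}
    training_file = client.files.create(
        file=open("train.jsonl", "rb"),
        purpose="fine-tune",
    )

    # Launch the training job; it is billed and can take a while to finish.
    job = client.fine_tuning.jobs.create(
        training_file=training_file.id,
        model="gpt-4o-mini-2024-07-18",  # example base model
    )
    print(job.id, job.status)

    Once the job finishes, the result is a new fine-tuned model name that you can call like any other model.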

    More here: https://community.openai.com/t/custom-gpts-vs-fine-tuning-whats-the-difference/477738/2

  • Python ChatGPT Chatbot Tutorial

    Python ChatGPT Chatbot Tutorial

    How do you set up your own chatbot interface with Python in 3 minutes?

    1. Create an OpenAI account and get your API key

      https://platform.openai.com
    2. Install Python and the OpenAI Python library

      pip install openai

    3. Run Python in Visual Studio Code (or wherever)

    from openai import OpenAI

    # Create a client with your API key (keep real keys out of your code in practice).
    client = OpenAI(api_key="sk-YOUR_KEY_COMES_HERE_MATE")

    def chat_with_gpt(prompt):
        """Send one prompt to the model and return its reply as plain text."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # pick any chat model you have access to
            messages=[
                {"role": "system", "content": "You are a helpful assistant."},
                {"role": "user", "content": prompt},
            ],
            # temperature=0.7,  # optionally make answers more or less creative
        )
        return response.choices[0].message.content.strip()

    if __name__ == "__main__":
        while True:
            user_input = input("Tom: ")
            if user_input.lower() in ["quit", "exit", "bye"]:
                break

            answer = chat_with_gpt(user_input)
            print("AIgnostic:", answer)

    You can specify any model you would like to test.
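
    The loop above sends each prompt on its own, so the model has no memory of earlier turns. Here is a minimal sketch of how you might keep a running conversation by appending every exchange to the messages list; the model name is just an example, and the API key is read from the OPENAI_API_KEY environment variable instead of being hard-coded.

    from openai import OpenAI

    client = OpenAI()  # picks up OPENAI_API_KEY from the environment

    # Keep the whole conversation so the model can refer back to earlier turns.
    history = [{"role": "system", "content": "You are a helpful assistant."}]

    while True:
        user_input = input("Tom: ")
        if user_input.lower() in ["quit", "exit", "bye"]:
            break

        history.append({"role": "user", "content": user_input})
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # swap in any chat model you want to test
            messages=history,
        )
        answer = response.choices[0].message.content.strip()
        history.append({"role": "assistant", "content": answer})
        print("AIgnostic:", answer)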

  • Hello world!

    Hello world!

    The AIgnostic – They don’t just follow AI; they philosophize about its societal impact with an almost religious fervor.

    This is my notebook on the sociological, socioeconomic and technical impact of AI.

    I can only assume the next few years will be a wild ride…