New to AI? Start with Ollama

So I’ll be honest… AI always felt confusing to me 😅

Too many tools, too many terms, and most of them need API keys or money.

Then I found Ollama… and things became much simpler.

In this post, I'll show you how a beginner can start using AI from Python with Ollama. Nothing complex.


What is Ollama?

Ollama is a simple tool that lets you run AI models directly on your own computer (Windows, Mac, or Linux).

Instead of doing complicated setup, it handles everything for you.
You just run one command, and models like Llama or Mistral start working.

In short:
👉 Ollama makes it easy to use AI on your laptop without any headache.

No Cloud
No API Key
No Cost

Just install and use.


Step 1: Install Ollama

Go to the official website (ollama.com) and download Ollama.

Install it like normal software. No special steps.


Step 2: Run Your First AI

Open a terminal (or Command Prompt on Windows) and run:

ollama run llama3

That’s it.

Now you can chat with the AI directly in your terminal. When you're done, type /bye to exit the chat.
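The first run downloads the model, which can take a while; after that it starts quickly. Two other Ollama commands worth knowing:

```shell
# download a model without starting a chat
ollama pull mistral

# list the models installed on your machine
ollama list
```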


Use Ollama with Python

Now comes the interesting part 👇

Ollama runs on your system as a local server.

So you can call it from Python like an API.


Simple Python Example

Install requests if you don’t have it:

pip install requests

Now use this code:

import requests

def ask_ollama(prompt):
    url = "http://localhost:11434/api/generate"

    data = {
        "model": "llama3",   # any model you've pulled with ollama
        "prompt": prompt,
        "stream": False      # get the full answer in one response
    }

    try:
        # local models can take a while, so set a generous timeout
        response = requests.post(url, json=data, timeout=120)
        response.raise_for_status()
        return response.json()["response"]
    except Exception as e:
        return f"Error: {e}"


# simple chat-like loop (type "exit" to stop)
while True:
    user_input = input("You: ")

    if user_input.lower() == "exit":
        break

    answer = ask_ollama(user_input)
    print("AI:", answer)

Run this… and you’ll get an AI response in your Python console.
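One detail worth knowing: the code sets "stream": False, so the whole answer arrives in one JSON object. Ollama can also stream the reply chunk by chunk: with "stream": true, the server sends one JSON object per line. Here's a small sketch of how you could stitch those chunks together (the join_stream helper and the sample lines are my own, just to show the format):

```python
import json

def join_stream(lines):
    """Concatenate the "response" pieces from streamed JSON lines."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):   # the last chunk has "done": true
            break
    return "".join(parts)

# with a live server, you would feed this from
# requests.post(url, json=data, stream=True).iter_lines()
sample = [
    '{"response": "Hel", "done": false}',
    '{"response": "lo!", "done": true}',
]
print(join_stream(sample))  # Hello!
```

Streaming makes the chat feel much more responsive, because you can print each piece as it arrives instead of waiting for the full answer.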


How Does This Work?

  • Ollama runs locally on your machine
  • Python sends a request
  • AI processes it
  • You get a response

Just like an API… but running on your laptop.


What Can You Build?

Once this is working, you can build simple things like:

  • Chatbot
  • Text summarizer
  • Q&A system
  • Smart input box (understands user text)

Start small… don’t overthink.
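Most of these ideas are just different prompts wrapped around the same ask_ollama() call. A tiny sketch of that pattern (build_prompt and its templates are hypothetical names I made up, not part of Ollama):

```python
def build_prompt(task, text):
    """Wrap the user's text in a task-specific instruction."""
    templates = {
        "summarize": "Summarize the following text in 2-3 sentences:\n\n{t}",
        "qa": "Answer the question using only this text:\n\n{t}",
    }
    return templates[task].format(t=text)

# the result of build_prompt(...) is what you pass to ask_ollama(...)
print(build_prompt("summarize", "Ollama lets you run AI models locally."))
```

A summarizer is then just ask_ollama(build_prompt("summarize", some_text)).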


Things to Keep in Mind

  • The first model download is big (a few GB)
  • You need decent RAM (8 GB recommended)
  • Responses may be slower than cloud AI services

But still… for learning, it’s amazing.


Final Thoughts

If you’re a beginner, don’t jump into complex AI tools.

Start simple.

Ollama + Python is one of the easiest ways to understand how AI actually works.

No cost.
No pressure.
Just try and learn.
