The Ultimate Logging Guide for Python Explained with Examples and Best Practices

What will we cover?

At the end of this tutorial we will have covered the following.

  • Why do we need logging?
  • Why not just use print statements?
  • How logging works
  • Adding logging to our project

Step 1: Why do we need logging?

As a beginning developer, you focus (and I did too) mostly on getting the program to do the job, and less on how it was being done.

Little thought was given to design and best practices.

Over time this changes, and you might wonder why.

Debugging. Extending the code. Adding modules to the ecosystem.

You might only relate to the first one, debugging.

That is, your program is not doing what you intended, and it might be difficult to find the bug. So you add print statements all over the code to figure out what is going on. If you are more advanced, you might be using the PyCharm debugger to find it.

Bottom line is, it is difficult. 

When your module becomes part of a bigger ecosystem, the unintended behavior might be difficult to track down, and you might not know where the error is.

Then adding print statements to all the modules in the ecosystem is not desirable.

Logging will not solve these problems for you, but it is a good tool to help.

But logging is used for more than debugging.

  • Issue diagnosis. Your service crashes when some user does something. The user might not really remember what caused it. Good logging can help you figure out how to replicate the scenario in your development environment.
  • Analytics. Logs can give you information about the load on your services, and when and which modules are used the most. This can help you improve the experience for users.

We already discussed that when you have bugs to catch, you often insert print statements to see what happens. These print statements need to be removed afterwards.

That can be a lot of work, especially if the bug you are hunting spans several modules.

We also learned that logging is used for issue diagnosis and analytics.

Yes, you could build your own way of doing issue diagnosis and analytics, and it might work. But if you use standard modules for logging, it will integrate easily with other systems.

Don’t build your own – if there is a good standard way to do it.

This holds for logging. As we will see later, using the standard logging module makes it easy to integrate all our logs into Grafana.

When we learn a bit more about logging, you will also realize that logs have different levels. The lowest level, debug, gives you a lot of information to help you find the bug. It is equivalent to adding all the print statements, and when you are done, they disappear again simply by raising the log level.

Step 2: How logging works

Logging comes in different levels.

  • DEBUG. Used to diagnose problems.
  • INFO. Confirmation that things are working as expected.
  • WARNING. Something unexpected happened or indicating some problem in the near future (but software is still working as expected).
  • ERROR. The software has not been able to perform some function as expected.
  • CRITICAL. A serious error. Program might not be able to continue running.

See the official docs here.

A simple example of how logging works is given here.

import logging
logging.warning('Watch out!')  # will print a message to the console
logging.info('I told you so')  # will not print anything

This might seem strange. But the default logging level is WARNING, which means that only messages at level WARNING and above (ERROR and CRITICAL) will be output.

On the other hand, if the logging level is set to DEBUG, then all logging messages will be output.

This can be achieved as follows, which also writes the log to a file.

import logging
logging.basicConfig(filename='example.log', encoding='utf-8', level=logging.DEBUG)
logging.debug('This message should go to the log file')
logging.info('So should this')
logging.warning('And this, too')
logging.error('And non-ASCII stuff, too, like Øresund and Malmö')

This will output all the logging messages to the file example.log.
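Since no format is configured, the default format of level, logger name, and message is used, so example.log will contain lines like these (note that the encoding argument to basicConfig requires Python 3.9 or newer):

DEBUG:root:This message should go to the log file
INFO:root:So should this
WARNING:root:And this, too
ERROR:root:And non-ASCII stuff, too, like Øresund and Malmö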

Step 3: Some good practices with logging

The best advice with logging is: do not add too many logs. At first you might be tempted to add logs all over.

Use the levels as intended, following the overview below.

  • DEBUG. Detailed information, typically of interest only when diagnosing problems.
  • INFO. Confirmation that things are working as expected.
  • WARNING. An indication that something unexpected happened, or indicative of some problem in the near future (e.g. 'disk space low'). The software is still working as expected.
  • ERROR. Due to a more serious problem, the software has not been able to perform some function.
  • CRITICAL. A serious error, indicating that the program itself may be unable to continue running.

The second thing to consider is, what to log?

  • When. Always log the time. When you look at logs from different services communicating with each other, the time stamp helps you line them up. It also lets you identify when something happened and whether it correlates with an incident.
  • Where. What application or file the log is coming from. This is also crucial when you have logs from many modules.
  • Level. Including the level makes it easy to filter for WARNINGs or similar.
  • What. Then the actual message – what happened.

There are different ways to configure the logging.

Common ways include a log configuration file or configuring it directly in the code. Here we will keep it simple and do it directly in the code.
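This tutorial does not use a configuration file, but for reference, a minimal sketch of configuring an equivalent logger from a dictionary (which could be loaded from a JSON or YAML file) with logging.config.dictConfig could look like this:

import logging.config

LOGGING_CONFIG = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'default': {
            'format': '%(asctime)s - %(name)s - %(levelname)s - %(message)s',
        },
    },
    'handlers': {
        'console': {
            'class': 'logging.StreamHandler',
            'formatter': 'default',
        },
    },
    'root': {
        'level': 'INFO',
        'handlers': ['console'],
    },
}

# Apply the configuration once, early in the program
logging.config.dictConfig(LOGGING_CONFIG)
logging.getLogger(__name__).info('Logger configured from a dictionary')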

Step 4: Adding logging to a REST API

In this tutorial we will add logging to the REST API we created here.

You can clone the code from the repository here (see how to do it in the above tutorial) or you can just follow along the code here.

The repository consists of the files:

  • README.md
  • requirements.txt
  • .gitignore
  • server.py
  • make_order.py
  • app/main.py
  • app/routers/order.py

For a description see the above tutorial.

We will set up the logger inside the main file app/main.py.

import logging
from http import HTTPStatus
from fastapi import FastAPI
from .routers import order
logging.basicConfig(encoding='utf-8', level=logging.INFO,
                    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s')
logger = logging.getLogger(__file__)
app = FastAPI(
    title='Your Fruit Self Service',
    version='1.0.0',
    description='Order your fruits here',
    root_path=''
)
app.include_router(order.router)

@app.get('/', status_code=HTTPStatus.OK)
async def root():
    """
    Endpoint for basic connectivity test.
    """
    logger.info('root called')
    return {'message': 'I am alive'}

The simple way to set up a logger (done only once in the code base) is to call basicConfig(…). Here we have set up encoding, level, and format. The format configures what to log (besides the message itself). It is a good idea to include the time, name, and level. The time stamp tells you when something happened, which is a crucial detail when you try to figure out what went wrong and compare logs from various services that communicate with each other.

The name is the file name, which tells you where the log originates from. As you will see next, the logger is used in several files in this module.

Finally, see the logger used in the root() endpoint, simply by calling logger.info('root called'). Adding logs like this can seem a bit of an overkill, but info logs like this can be crucial when you want to check whether a service was running. It is common practice to have an endpoint like this and another service which calls it every minute or so to check that the service is alive. Now you can also check in the log whether the corresponding entry is there for every minute (or however often you call it).

Now you have created the possibility to monitor if the service is running, and a log which can tell you the history of calls.
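To illustrate, a minimal sketch of such a monitoring script (not part of the repository; the URL assumes the server runs locally on port 8000) could be:

import time
import requests

URL = 'http://127.0.0.1:8000/'  # the root endpoint from app/main.py

while True:
    try:
        response = requests.get(URL, timeout=5)
        print(f'Health check: {response.status_code}')
    except requests.RequestException as exc:
        print(f'Service unreachable: {exc}')
    time.sleep(60)  # poll once a minute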

Now let’s explore the file app/routers/order.py.

import logging
from http import HTTPStatus
from typing import Dict
from fastapi import APIRouter
logger = logging.getLogger(__file__)
router = APIRouter(tags=['income'])

@router.post('/order', status_code=HTTPStatus.OK)
async def order_call(order: str) -> Dict[str, str]:
    logger.info(f'Incoming order: {order}')
    return {'Received order': order}

Here you see that the logger is initialized based on the file name (__file__). Each file gets its own logger, named after its path, but they all inherit the configuration we set with basicConfig() in main.py, which configures the root logger that every logger propagates to.

Here we changed the print statement to an info-log.

In this project we name the logger after the file (__file__) (notice there are two underscores before and after file, not one long one). This is easy to start with and makes it simple to locate exactly which file a log entry comes from.

It is common to use __name__ when building modules, as it holds the qualified name of the module. It has the advantage over __file__ that the output is shorter but just as descriptive, and loggers named this way form a hierarchy, so configuration defined on a higher-level logger (for example the app logger) is inherited by the loggers below it.
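A minimal sketch of the __name__ variant (not what this project uses, shown only for comparison):

import logging

# Inside app/routers/order.py this logger would be named 'app.routers.order'
# and would inherit handlers and levels configured on 'app' or the root logger.
logger = logging.getLogger(__name__)
logger.info('This logger is named after the module, not the file path')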

Step 5: Run it and recap

To run the server, simply run the server.py.

Then to test it, try to run the make_order.py.

This should create logs in the output of the server.py terminal.

To summarize it all.

  • What are the use cases of logging.
    • Debugging. Finding that nasty bug that is bugging you.
    • Issue diagnosis. When you get an issue and need to replicate it – then logs can help you figure out what happened.
    • Analytics. You want to know how much the modules are used and by whom; logs are a great way to find out.
  • As a beginner it can be tempting to use print statements; do not fall for that urge.
  • What are the different log-levels and how to use them.
    • Debug, info, warning, error, critical.
  • How to make simple configuration of the logger and set log-levels.
    • A simple log configuration will include time, file name, level, and message.
  • Then we added some info logs to a REST API.
  • The official Python logging guide is quite good (docs).
  • Here we have covered the basics you need to understand. The specific setup varies from project to project and from organization to organization.

Want to learn more?

Get my book that will teach you everything a modern Python cloud developer needs to master.

Learn how to create REST API microservices that generate metrics that allow you to monitor the health of the service.

What does all that mean? 

Don’t wait, get my book and master it and become one of the sought after Python developers and get your dream job.

5 Steps to Create a REST API with Python using FastAPI

What will we cover?

In this tutorial we will learn how to make a REST API using FastAPI.

This includes.

  • What is a REST API
  • How to install the requirements
  • Structure of files
  • How to add an endpoint to the REST API
  • How to run and test it

Step 1: What is a REST API?

You have probably heard about REST APIs (REpresentational State Transfer).

If you google it, you will probably find some principles it needs to fulfill. And yes, there is the formal definition, but it can be difficult to translate that formal definition into an understanding of what a REST API actually is.

Let’s think about it differently.

First of all – what is an API (Application Programming Interface)?

  • It enables software or modules in software to talk to each other.
  • It doesn’t have to be the same programming language.

Actually, APIs are an awesome invention. They can also be thought of as a contract between software modules. Let's say modules A and B communicate (or talk) with each other through a specified API. Then you can change a module, say module B, as long as it still follows the API. This makes the software easier to maintain.

A REST API has some additional restrictions.

When most people talk about REST APIs they mean a web API where you send an HTTP verb and a URL (or URI) which describes the location of the resource.

That means a few things.

  • REST API is a client-server architecture. Like a browser (client) and webserver.
  • REST API has a URI and it is like a webserver with different pages or resources.
    • URI (Uniform Resource Identifier) is a unique sequence of characters to identify a resource (like a web server, REST API, or similar).

Funny note: Most junior developers think they need to understand all these concepts (like REST APIs) in detail. In reality, most people use these concepts in a vague manner, and most seniors would not be able to recite all the design principles behind them.

What is the most important part about a REST API?

  • Stateless. The server does not know what you have just done; you need to explain everything in every call you make. This keeps the server logic simple: it does not need to check any history or state from the caller (or client), it knows exactly what to do from the path and parameters.
  • HTTP verbs. A REST API uses HTTP request methods (see the sketch after this list). The most common are.
    • GET. Retrieves resources.
    • POST. Submits new data to the server.
    • PUT. Updates existing data.
    • DELETE. Removes data.
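To make the verbs concrete, here is a small sketch using the requests library against a hypothetical orders resource (the URL and paths are made up for illustration; the fruit service we build below only implements GET / and POST /order):

import requests

BASE_URL = 'http://127.0.0.1:8000'  # hypothetical service, for illustration only

requests.get(f'{BASE_URL}/orders')                             # GET: retrieve resources
requests.post(f'{BASE_URL}/orders', json={'fruit': 'banana'})  # POST: submit new data
requests.put(f'{BASE_URL}/orders/1', json={'fruit': 'apple'})  # PUT: update existing data
requests.delete(f'{BASE_URL}/orders/1')                        # DELETE: remove data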

Some common practices in REST API’s are.

  • JSON. Most REST APIs use JSON to transfer the requests and answers.
    • JSON (JavaScript Object Notation) is an open standard format for data exchange in a human-readable format. It is widely used and not limited to JavaScript, as the name suggests.
  • Path names are nouns. It is common that the paths defining the endpoints are nouns.

On this journey, we will create simple resources and expose them as REST APIs.

We will only have endpoints (paths) that we use – this ensures we have a simple interface and focus on what matters to learn what we intend.

Let’s get started with our first simple REST API.

Step 2: Clone and install requirements

The easiest way to get started is by cloning an existing project structure and diving into it. You will be surprised how easy it is after some inspection.

The easiest way to clone a project is to use an IDE like PyCharm.

But you can also do it from the command line in a terminal with the following command. Note that you need git installed.

git clone https://github.com/LearnPythonWithRune/fruit-service.git

This will clone this repository in a folder called fruit-service where you are located in the terminal.

Then you need to install the requirements, but before that it is a good idea to create a virtual environment.

Go to the folder of the newly cloned repository.

cd fruit-service

Now you should be located in the newly cloned repository (after executing the above command). Then create a virtual environment for Python.

python -m venv venv

This creates a virtual environment, now you need to activate it (which depends on the operating system).

If on Unix or Mac:

source venv/bin/activate

If on Windows

venv\Scripts\activate.bat

NOTICE If you use PyCharm all of this is done for you when you create a new project. This is what makes your life easier and you don’t need to bother with all of this.

Now you need to install the requirements.

Take a look at the requirements.txt file.

uvicorn==0.17.5
fastapi==0.75.0
requests==2.27.1

This is a list of all the libraries we will use in the project.

  • uvicorn is the server that runs the REST API.
  • fastapi is the framework we use to write the code for our REST API.
  • requests is used by the Python script that calls our REST API.

To install all the libraries execute the following command in the virtual environment we have created.

pip install -r requirements.txt

Now we are ready to start.

Step 3: Explore the files in the project

We see the project contains some files.

  • README.md
  • requirements.txt
  • .gitignore
  • server.py
  • make_order.py
  • app/main.py
  • app/routers/order.py

We will shortly explore them.

Just to note, the venv folder contains the virtual environment and you should ignore the content in it.

README.md

The README.md is an essential guide that gives other developers a description of your GitHub project. It is written in Markdown, which is a lightweight markup language for creating formatted text using a plain-text editor.

We will not explore this file further. Simply, think of it as a description for others to understand the project.

The detail level can vary a lot, as you see here.

requirements.txt

This file is essential – it contains a list of the libraries that are needed by the project. We already installed them in previous step.

.gitignore

It tells Git which files to ignore when committing your project. For the most part, you can ignore the file as well.

The files: server.py,  app/main.py, and app/routers/order.py

These files are the basis for the REST API we will run in a moment. They are all connected together and use the FastAPI Python framework.

While the API is very simple and could be implemented in a single file, I wanted to show you how an API could be structured in a bigger project.

Why FastAPI?

We could use other frameworks, but FastAPI is simple to learn and understand.

In a moment we will explore it further.

make_order.py

This script calls the API.

That is, when the API is running, you can use this script to call the API.

Step 4: Explore the REST API code

First let’s look at the code in app/main.py

from http import HTTPStatus
from fastapi import FastAPI
from app.routers import order
app = FastAPI(
    title='Your Fruit Self Service',
    version='1.0.0',
    description='Order your fruits here',
    root_path=''
)
app.include_router(order.router)

@app.get('/', status_code=HTTPStatus.OK)
async def root():
    """
    Endpoint for basic connectivity test.
    """
    return {'message': 'I am alive'}

This is the main file which sets up the REST API. Notice that the name, version, and description are set on the app; then it includes a router (we will explore that afterwards); and then it adds a GET endpoint (the default one).

This default endpoint (called root()) does not really have any functionality. It is common practice to have one endpoint like that. The reason is to have another service calling it all the time to test if it is alive. This makes it easy to monitor if the service (the REST API) is running.

Now let’s get back to this.

app.include_router(order.router)

This adds a router to our app. This router is located in the file app/routers/order.py (you can see that from the import statement).

Let’s explore that file.

from http import HTTPStatus
from fastapi import APIRouter
router = APIRouter()

@router.post('/order', status_code=HTTPStatus.OK)
async def order_call(order: str):
    print(f'Incoming order: {order}')
    return {'order': order}

In a REST API you would place all the endpoints in the subfolder app/routers/ like this one.

You can see that it contains a POST endpoint (remember the HTTP verbs defined in Step 1) at the path /order. This call takes one argument, order, of type str (string).

This endpoint does not do much: it prints a statement to the terminal where the server is running and returns the JSON data {'order': order}, where order is the incoming argument.

Now it is time to try the REST API.

Step 5: Running and calling our REST API

There are multiple ways to run the REST API. Here we have created a server.py file which sets it up, so you don't need to remember any command lines.

If you run

python server.py

Then the server will start.
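The content of server.py is not shown in this tutorial; a minimal sketch, assuming it simply starts uvicorn programmatically (the actual file in the repository may differ), could look like this:

import uvicorn

if __name__ == '__main__':
    # Serve the FastAPI app defined in app/main.py on localhost:8000
    uvicorn.run('app.main:app', host='127.0.0.1', port=8000, reload=True)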

You can call it from the Swagger docs on your local host: http://127.0.0.1:8000/docs

From here you can call it by expanding the /order endpoint and typing in banana.

Then press the blue Execute button. This should result in output in the terminal where you run your server.

Using the Swagger docs interactively like this is a great way to manually test the REST API.

In case you wonder, the Swagger docs are generated automatically by the FastAPI framework.

If you want to have a Python script to call your REST API then look at the make_order.py file.

import random
import requests

banana = '🍌'
apple = '🍎'
pear = '🍐'
items = [banana, apple, pear]
# Make a random order
order = items[random.randrange(len(items))]

url = "http://127.0.0.1:8000"
response = requests.post(
    url=f'{url}/order',
    params={
        'order': order
    }
)
print(f'Status code: {response.status_code}, order: {order}')

Run it and see what happens.
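If the server is running, you should see output along these lines (the fruit varies because it is picked at random):

Status code: 200, order: 🍌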

Want to learn more?

Get my book that will teach you everything a modern Python cloud developer needs to master.

Learn how to create REST API microservices that generate metrics that allow you to monitor the health of the service.

What does all that mean?

Don’t wait, get my book and master it and become one of the sought after Python developers and get your dream job.

How to Work With JSON Responses Without Documentation

What will you learn?

You need to use a REST API that returns JSON responses, but the documentation is missing.

What do you do?

Everybody knows you need to write your documentation – but even large enterprise companies like Microsoft have almost no documentation on their REST API – so what do you do? In this tutorial you will learn an easy way to figure out the structure of the JSON response you get from most REST APIs.

Step 1: Use a debugger to figure out the structure of the response

Most REST APIs give responses in JSON. While JSON is a structured way to represent data, it can be difficult to figure out how a particular response is structured.

Many REST APIs are not well documented, or the documentation is not up to date. This can make it challenging to figure out how to interpret the result.

A great way to do that is to use a debugger to figure out the structure. An IDE like PyCharm does a great job at this and works out of the box after installing it (download it here and use the Community version, as it is free).

Step 2: Let’s try an example

The best way to learn is by trying it out yourself.

You can clone my GitHub repository here. After you clone it, make sure to have a virtual environment with Python and install the requirements (all explained in the video).

Alternatively, you just need to install the library yahoofinancials. This is done as follows.

pip install yahoofinancials

Then create a file with the following content.

import json
from yahoofinancials import YahooFinancials

def get_statement(ticker, frequency='quarterly', statement='income'):
    yahoo_financials = YahooFinancials(ticker)
    return yahoo_financials.get_financial_stmts(frequency, statement)

my_ticker = 'AAPL'
income_statement = get_statement(my_ticker)
print(json.dumps(income_statement, indent=2))

This will serve as our example. As you see, we call YahooFinancials (which strictly speaking is not a REST API, but it returns JSON).

Then we print the JSON-response.

This is a simple example. Below we see the output, and then we will dive into how to get the data we want.

{
  "incomeStatementHistoryQuarterly": {
    "AAPL": [
      {
        "2022-06-25": {
          "researchDevelopment": 6797000000,
          "effectOfAccountingCharges": null,
          "incomeBeforeTax": 23066000000,
          "minorityInterest": null,

Step 3: Use the debugger to find the structure

Many JSON responses are more complex than the above (which is only a small sample of the output).

A great way to find the structure is to use the debugger, by setting a break point.

Set a breakpoint at the print statement and start the debugger.

As you see, you can unfold the structure of the JSON response.

This shows that you have a dictionary with the ticker name, then it follows a list of income statements. Each income statement has a structure too (not unfolded in the example).

Step 4: Extract the values the correct way

The best way to get values from a dictionary is by using get. This gives you the chance to provide a default return value if the key is not present in the dictionary. A good default is an empty dictionary {}, as the program can then proceed without failing.
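The original article shows the extraction code as an image. Here is a minimal sketch of the idea, continuing from the income_statement variable above (the key names follow the sample output shown earlier):

# Fetch the nested structures with get() and fall back to an empty dict/list,
# so a missing key does not crash the program.
history = income_statement.get('incomeStatementHistoryQuarterly', {})
statements = history.get('AAPL', [])

for statement in statements:
    for date, values in statement.items():
        # Each value is looked up with get() as well, returning None if absent
        print(date, values.get('incomeBeforeTax'), values.get('researchDevelopment'))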


Using get with a default value like this ensures the code will run whether or not the keys are present in the JSON response, so the lines that use the result will not fail either.

Conclusion

Using a debugger to understand the structure of a JSON response from a REST API is a fast and effective approach. Also, using the dictionary method get is a great way to keep the program from crashing when a key is not present in a response.