Navigate to the /doc directory to read more.
Links to the .md file for each topic:
- Context Manager
- Generator
- Decorator
- Higher-Order Function
- Logging in Python
- Threading in Python
- Asynchronous Programming
- RegEx
- Serialization
Logging is the "black box" flight recorder for your code. While print() tells you what’s happening now, logging tells you what happened at 3:00 AM when the server crashed while you were sleeping.
The first rule of professional Python: Stop using print() for debugging. print() is hard to turn off, hard to search, and provides no context.
| Level | Value | Use Case |
|---|---|---|
| DEBUG | 10 | Detailed info, typically of interest only when diagnosing problems. |
| INFO | 20 | Confirmation that things are working as expected. |
| WARNING | 30 | Something unexpected happened, but the app is still working. |
| ERROR | 40 | Due to a more serious problem, the software has not been able to perform some function. |
| CRITICAL | 50 | A serious error, indicating that the program itself may be unable to continue running. |
import logging
# Basic configuration
logging.basicConfig(
    level=logging.DEBUG,
    format='%(asctime)s - %(levelname)s - %(message)s',
    filename='app.log',  # Saves to a file
    filemode='a'         # 'a' for append, 'w' for overwrite
)
logging.info("Application started")
logging.error("Failed to connect to database")

In real apps, you don't just want one log file. You want logs to go to the Console (for development) and a File (for production).
Instead of the "root" logger, we use `logging.getLogger(__name__)`. This tells you exactly which file generated the log.
import logging
# 1. Create a logger
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
# 2. Create Handlers (Where the logs go)
console_handler = logging.StreamHandler()
file_handler = logging.FileHandler('error.log')
file_handler.setLevel(logging.ERROR) # Only save Errors to the file
# 3. Create a Formatter (What the logs look like)
formatter = logging.Formatter('%(name)s - %(levelname)s - %(message)s')
console_handler.setFormatter(formatter)
file_handler.setFormatter(formatter)
# 4. Add Handlers to the Logger
logger.addHandler(console_handler)
logger.addHandler(file_handler)
logger.info("This goes to console.")
logger.error("This goes to BOTH console and file.")

For large apps, we use dictConfig. This allows you to define your entire logging strategy in a dictionary (or a JSON/YAML file).
- Rotating Files: Prevents log files from becoming 50GB and crashing your server.
- JSON Logging: Makes logs "machine-readable" for tools like ELK or Datadog.
- Tracebacks: Capturing the full error stack automatically.
import logging.config
LOGGING_CONFIG = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'standard': {'format': '%(asctime)s [%(levelname)s] %(name)s: %(message)s'},
        'json': {'class': 'pythonjsonlogger.jsonlogger.JsonFormatter'}
    },
    'handlers': {
        'default': {
            'level': 'INFO',
            'formatter': 'standard',
            'class': 'logging.handlers.RotatingFileHandler',
            'filename': 'app.log',
            'maxBytes': 10485760,  # 10MB
            'backupCount': 5
        },
    },
    'loggers': {
        '': {'handlers': ['default'], 'level': 'DEBUG', 'propagate': True}
    }
}
logging.config.dictConfig(LOGGING_CONFIG)

Use Case: Tracking a single user request across multiple functions.
- Tip: Inject a `request_id` into every log so you can filter all logs for one specific failed transaction.
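One common way to implement this tip is a `logging.Filter` backed by a `contextvars.ContextVar`. This is a sketch under assumptions: the `request_id_var` and `RequestIdFilter` names are hypothetical, not from any particular framework.

```python
import contextvars
import logging
import uuid

# Hypothetical context variable holding the current request's ID.
request_id_var = contextvars.ContextVar("request_id", default="-")

class RequestIdFilter(logging.Filter):
    """Attach the current request_id to every log record."""
    def filter(self, record):
        record.request_id = request_id_var.get()
        return True  # True means the record is not filtered out

logger = logging.getLogger("app")
logger.setLevel(logging.INFO)
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter("%(request_id)s %(levelname)s %(message)s"))
handler.addFilter(RequestIdFilter())
logger.addHandler(handler)

# Simulate handling one request: every record now carries the same ID.
request_id_var.set(str(uuid.uuid4()))
logger.info("Request started")
logger.info("Request finished")
```

Because the filter reads the context variable at log time, every function touched by the request logs the same `request_id` without passing it around explicitly.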
import uuid
import logging
from fastapi import FastAPI, Request

logger = logging.getLogger(__name__)
app = FastAPI()

@app.middleware("http")
async def log_requests(request: Request, call_next):
    request_id = str(uuid.uuid4())
    logger.info(f"Request {request_id} started: {request.url.path}")
    response = await call_next(request)
    logger.info(f"Request {request_id} finished")
    return response

Use Case: Auditing model decisions and resource usage.
- Tip: Log the Model Version, Input features, and Prediction Confidence. This is vital for "Explainable AI."
def predict(data):
    logger.info("Inference started", extra={
        "model_version": "v2.1",
        "input_shape": data.shape
    })
    prediction = model.predict(data)
    logger.info(f"Prediction made: {prediction}", extra={"confidence": 0.98})
    return prediction

Use Case: Performance monitoring and metadata tracking.
- Tip: Log the FPS (Frames Per Second) and the number of objects detected per frame.
import time
def process_frame(frame):
    start_time = time.time()
    # ... detection logic ...
    fps = 1.0 / (time.time() - start_time)
    if fps < 20:
        logger.warning(f"Performance Drop! FPS: {fps:.2f}")
    logger.debug(f"Detected {len(objects)} objects in frame.")

- Never log secrets: Use filters to strip out passwords or API keys before they hit the file.
- Use `logger.exception()`: Inside an `except` block, use `logger.exception("message")`. It automatically attaches the full stack trace.
- Lazy Formatting: Use `logger.info("User %s", name)` instead of f-strings like `f"User {name}"`. This is faster because Python won't format the string if the log level is disabled.
- Consider Loguru: If you want a modern, "zero-config" experience, the `loguru` library is the gold standard in 2025.
A Generator in Python is a special type of iterator that allows you to iterate over a potentially large sequence of data without loading the entire sequence into memory at once.
The key difference between a regular function that returns a list (or other iterable) and a generator function is the use of the yield keyword instead of return.
When a generator function is called, it returns a generator object (an iterator).
When a value is requested from the generator (e.g., in a for loop or by calling next()), the function executes until it hits a yield statement.
The yield statement pauses the function, saves its local state (variables and instruction pointer), and returns the yielded value.
When the next value is requested, the function resumes exactly where it left off.
This process is called lazy evaluation or on-demand generation.
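A minimal generator illustrating this pause-and-resume behavior:

```python
def countdown(n):
    """Yield n, n-1, ..., 1, pausing between values."""
    while n > 0:
        yield n  # Pause here; resume on the next request
        n -= 1

gen = countdown(3)
print(next(gen))  # 3
print(next(gen))  # 2
print(next(gen))  # 1
# A further next(gen) would raise StopIteration.
```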
These are similar to list comprehensions but use parentheses ( ) instead of square brackets [ ].
Example: Squares of Numbers

A list comprehension creates the entire list in memory:
# List comprehension (creates list [0, 1, 4, 9] immediately)
my_list = [x*x for x in range(4)]
print(f"List: {my_list}")

A generator expression creates a generator object that yields values one at a time:
# Generator expression (creates generator object)
my_gen_exp = (x*x for x in range(4))
print(f"Generator Expression Object: {my_gen_exp}")
# Iterate over the generator expression
for square in my_gen_exp:
    print(f"Square: {square}")

Developers primarily use generators for the following use cases:
- Processing huge log files or CSVs line by line without loading the entire file content into RAM.
- Processing massive datasets where samples are fed to a model one batch at a time (e.g., using a generator function as a Keras data sequence).
- Handling data streams (like network packets or real-time sensor readings) where the total number of items is unknown or potentially infinite.
- Creating efficient, readable pipelines where the output of one generator is fed as the input to the next.
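As a sketch of such a pipeline (the `read_lines` and `only_errors` helper names are hypothetical): each stage is a generator, so no intermediate list is ever materialized.

```python
def read_lines(lines):
    """Stage 1: strip whitespace from each incoming line."""
    for line in lines:
        yield line.strip()

def only_errors(lines):
    """Stage 2: keep only lines that mention ERROR."""
    for line in lines:
        if "ERROR" in line:
            yield line

log = ["INFO ok\n", "ERROR disk full\n", "ERROR timeout\n"]
pipeline = only_errors(read_lines(log))  # Nothing runs yet (lazy)
print(list(pipeline))  # ['ERROR disk full', 'ERROR timeout']
```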
Useful for tailing a log file: the generator yields each new record as it is appended.
import time

# File tailing
def follow(file):
    file.seek(0, 2)  # Go to the end of the file
    while True:
        line = file.readline()
        if not line:
            time.sleep(0.1)  # No new line yet; wait briefly
            continue
        yield line

Open the file and tail it: as each new line is appended, it is printed to the terminal.

file = open("Generator.md")
of = follow(file=file)
print(type(of))
# Iterate over the file
for line in of:
    print(line)
    # if line[:1] == '.':
    #     break

A Decorator is one of Python’s most elegant features. At its simplest, a decorator is a way to "wrap" a function or a method to change or extend its behavior without permanently modifying its original code.
Think of it like adding toppings to a plain cheese pizza. You aren't changing the fundamental pizza (the function); you are just adding extra layers on top of it.
In Python, functions are "first-class objects." This means functions can be passed around as arguments to other functions, just like variables.
A decorator is just a function that takes another function as an input and returns a new function as an output.
To understand it, look at this "manual" way of doing it before we use the @ syntax:
def my_decorator(original_function):
    # This inner function 'wraps' the original one
    def wrapper():
        print("1. Doing something before the function runs.")
        original_function()
        print("2. Doing something after the function runs.")
    return wrapper  # Return the wrapper, ready to be called

def say_hello():
    print("Hello!")

# Manual decoration
decorated_hello = my_decorator(say_hello)
decorated_hello()

Python makes this easier with the @ symbol. Placing @decorator_name above a function is exactly the same as writing func = decorator_name(func).
@my_decorator
def say_hello():
    print("Hello!")

say_hello()

Developers use decorators to keep their code DRY (Don't Repeat Yourself). Instead of writing the same logic in ten different functions, you write one decorator and apply it to all ten.
You want to know how long a specific function takes to run.
import time
def timer(func):
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        end = time.time()
        print(f"Execution time: {end - start:.4f} seconds")
        return result
    return wrapper

@timer
def heavy_computation():
    time.sleep(1.5)  # Simulating work
    return "Done!"

heavy_computation()

In web frameworks like Flask or Django, decorators check if a user is logged in before allowing them to see a page.
def login_required(func):
    def wrapper(user_is_logged_in):
        if not user_is_logged_in:
            print("Access Denied: Please log in.")
            return
        return func(user_is_logged_in)
    return wrapper

@login_required
def view_secret_dashboard(user_is_logged_in):
    print("Welcome to the secret dashboard!")

If you are struggling to wrap your head around it, use these three mental steps:
- The Box: Imagine the decorator is a box.
- The Gift: Your original function is a gift you put inside that box.
- The Delivery: When someone asks for the "gift," they actually get the box. The box might have a bow on top (code before) or a "thank you" note inside (code after), but the gift (the function) is still there in the middle.
- Nested Functions: You must be comfortable with functions defined inside other functions.
- Returning Functions: Remember that the decorator doesn't run the function immediately; it returns a new version of it to be run later.
- `*args` and `**kwargs`: You will often see these inside decorators so that the decorator can work with any function, regardless of how many arguments that function takes.
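Putting these prerequisites together, a generic decorator is often written as below. This is a sketch; `functools.wraps` (from the standard library) is a common addition that preserves the wrapped function's name and docstring, and the `log_call` name is just illustrative.

```python
import functools

def log_call(func):
    @functools.wraps(func)  # Keeps func.__name__ and __doc__ intact
    def wrapper(*args, **kwargs):
        # *args/**kwargs let this wrap a function of any signature
        print(f"Calling {func.__name__} with {args}, {kwargs}")
        return func(*args, **kwargs)
    return wrapper

@log_call
def add(a, b):
    """Return a + b."""
    return a + b

print(add(2, 3))     # 5
print(add.__name__)  # 'add' (thanks to functools.wraps)
```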