Uzair42/recluze_python_II

Navigate to the /doc directory to read more.

Links to the .md documentation files:

Logging Doc

Logging is the "black box" flight recorder for your code. While print() tells you what’s happening now, logging tells you what happened at 3:00 AM when the server crashed while you were sleeping.


🟢 Level 1: The Beginner (Severities & Basic Config)

The first rule of professional Python: Stop using print() for debugging. print() is hard to turn off, hard to search, and provides no context.

The 5 Standard Levels

| Level    | Value | Use Case                                                                         |
|----------|-------|----------------------------------------------------------------------------------|
| DEBUG    | 10    | Detailed info, typically of interest only when diagnosing problems.              |
| INFO     | 20    | Confirmation that things are working as expected.                                |
| WARNING  | 30    | Something unexpected happened, but the app is still working.                     |
| ERROR    | 40    | A more serious problem: the software has not been able to perform some function. |
| CRITICAL | 50    | A serious error indicating the program itself may be unable to continue running. |
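The numeric values matter: a logger only emits records at or above its configured level. A minimal sketch of that filtering (the logger name `levels_demo` and the in-memory stream are just for illustration):

```python
import io
import logging

logger = logging.getLogger("levels_demo")
logger.setLevel(logging.WARNING)   # threshold: 30
logger.propagate = False           # keep the demo self-contained

# Capture output in memory so we can inspect it
stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(levelname)s:%(message)s"))
logger.addHandler(handler)

logger.debug("too quiet")        # 10 < 30 -> dropped
logger.info("still too quiet")   # 20 < 30 -> dropped
logger.warning("logged")         # 30 >= 30 -> kept
logger.error("also logged")      # 40 >= 30 -> kept

print(stream.getvalue())
```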

Simple Setup

import logging

# Basic configuration
logging.basicConfig(
    level=logging.DEBUG,
    format='%(asctime)s - %(levelname)s - %(message)s',
    filename='app.log', # Saves to a file
    filemode='a'         # 'a' for append, 'w' for overwrite
)

logging.info("Application started")
logging.error("Failed to connect to database")

🟡 Level 2: The Intermediate (Handlers & Formatters)

In real apps, you don't just want one log file. You want logs to go to the Console (for development) and a File (for production).

Creating a Custom Logger

Instead of the "root" logger, we use __name__. This tells you exactly which file generated the log.

import logging

# 1. Create a logger
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)

# 2. Create Handlers (Where the logs go)
console_handler = logging.StreamHandler()
file_handler = logging.FileHandler('error.log')
file_handler.setLevel(logging.ERROR) # Only save Errors to the file

# 3. Create a Formatter (What the logs look like)
formatter = logging.Formatter('%(name)s - %(levelname)s - %(message)s')
console_handler.setFormatter(formatter)
file_handler.setFormatter(formatter)

# 4. Add Handlers to the Logger
logger.addHandler(console_handler)
logger.addHandler(file_handler)

logger.info("This goes to console.")
logger.error("This goes to BOTH console and file.")

🔴 Level 3: The Advanced (Production Grade)

For large apps, we use dictConfig. This allows you to define your entire logging strategy in a dictionary (or JSON/YAML file).

Key Advanced Features:

  • Rotating Files: Prevents log files from becoming 50GB and crashing your server.
  • JSON Logging: Makes logs "machine-readable" for tools like ELK or Datadog.
  • Tracebacks: Capturing the full error stack automatically.
import logging.config

LOGGING_CONFIG = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'standard': {'format': '%(asctime)s [%(levelname)s] %(name)s: %(message)s'},
        'json': {'class': 'pythonjsonlogger.jsonlogger.JsonFormatter'}  # third-party: pip install python-json-logger
    },
    'handlers': {
        'default': {
            'level': 'INFO',
            'formatter': 'standard',
            'class': 'logging.handlers.RotatingFileHandler',
            'filename': 'app.log',
            'maxBytes': 10485760, # 10MB
            'backupCount': 5
        },
    },
    'loggers': {
        '': {'handlers': ['default'], 'level': 'DEBUG', 'propagate': True}
    }
}

logging.config.dictConfig(LOGGING_CONFIG)
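To see dictConfig in action without writing files, here is a self-contained, trimmed-down sketch of the same idea (an in-memory stream stands in for the rotating file handler; the logger name `myapp.db` is hypothetical):

```python
import io
import logging
import logging.config

stream = io.StringIO()

logging.config.dictConfig({
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "standard": {"format": "[%(levelname)s] %(name)s: %(message)s"},
    },
    "handlers": {
        "default": {
            "level": "INFO",               # handler drops anything below INFO
            "formatter": "standard",
            "class": "logging.StreamHandler",
            "stream": stream,              # in-memory stand-in for a file
        },
    },
    "loggers": {
        "": {"handlers": ["default"], "level": "DEBUG"},
    },
})

logger = logging.getLogger("myapp.db")     # any module picks up the root config
logger.info("Connected to database")
logger.debug("Pool details")               # below the handler's INFO level -> dropped
print(stream.getvalue())
```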

🚀 Field-Specific Applications

1. Web Apps (Flask/FastAPI)

Use Case: Tracking a single user request across multiple functions.

  • Tip: Inject a request_id into every log so you can filter all logs for one specific failed transaction.
import uuid
from fastapi import FastAPI, Request

app = FastAPI()
logger = logging.getLogger(__name__)  # assumes `import logging` as above

@app.middleware("http")
async def log_requests(request: Request, call_next):
    request_id = str(uuid.uuid4())
    logger.info(f"Request {request_id} started: {request.url.path}")
    response = await call_next(request)
    logger.info(f"Request {request_id} finished")
    return response

2. AI Apps (Training/Inference)

Use Case: Auditing model decisions and resource usage.

  • Tip: Log the Model Version, Input features, and Prediction Confidence. This is vital for "Explainable AI."
def predict(data):
    logger.info("Inference started", extra={
        "model_version": "v2.1",
        "input_shape": data.shape
    })
    prediction = model.predict(data)  # `model` assumed loaded elsewhere
    logger.info(f"Prediction made: {prediction}", extra={"confidence": 0.98})
    return prediction

3. Computer Vision Apps (OpenCV/PyTorch)

Use Case: Performance monitoring and metadata tracking.

  • Tip: Log the FPS (Frames Per Second) and the number of objects detected per frame.
import time

def process_frame(frame):
    start_time = time.time()
    objects = detect(frame)  # `detect` stands in for your detection logic
    fps = 1.0 / (time.time() - start_time)

    if fps < 20:
        logger.warning(f"Performance Drop! FPS: {fps:.2f}")

    logger.debug(f"Detected {len(objects)} objects in frame.")

Best Practices for 2025

  1. Never log secrets: Use filters to strip out passwords or API keys before they hit the file.
  2. Use logger.exception(): Inside an except block, use logger.exception("message"). It automatically attaches the full stack trace.
  3. Lazy Formatting: Use logger.info("User %s", name) instead of f-strings f"User {name}". This is faster because Python won't format the string if the log level is disabled.
  4. Consider Loguru: If you want a modern, "zero-config" experience, the loguru library is the gold standard in 2025.
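Practices 1-3 above can be sketched together in a few lines (the logger name `practices_demo` and the `RedactSecrets` filter are illustrative, not a standard API):

```python
import io
import logging

class RedactSecrets(logging.Filter):
    """Drop any record whose rendered message mentions a password."""
    def filter(self, record):
        return "password" not in record.getMessage().lower()

stream = io.StringIO()
handler = logging.StreamHandler(stream)
logger = logging.getLogger("practices_demo")
logger.addFilter(RedactSecrets())
logger.addHandler(handler)
logger.propagate = False

# Lazy formatting: "alice" is only interpolated if the record is emitted.
logger.warning("User %s logged in", "alice")
logger.warning("User sent password=hunter2")  # stripped by the filter

try:
    1 / 0
except ZeroDivisionError:
    # logger.exception() logs at ERROR level and appends the full traceback.
    logger.exception("Division failed")

print(stream.getvalue())
```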

Generator

What is a Python Generator?

A Generator in Python is a special type of iterator that allows you to iterate over a potentially large sequence of data without loading the entire sequence into memory at once.

The key difference between a regular function that returns a list (or other iterable) and a generator function is the use of the yield keyword instead of return.

When a generator function is called, it returns a generator object (an iterator).

When a value is requested from the generator (e.g., in a for loop or by calling next()), the function executes until it hits a yield statement.

The yield statement pauses the function, saves its local state (variables and instruction pointer), and returns the yielded value.

When the next value is requested, the function resumes exactly where it left off.

This process is called lazy evaluation or on-demand generation.
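The pause-and-resume behavior described above can be seen with a tiny generator function (`count_up_to` is just an illustrative name):

```python
def count_up_to(n):
    """Yield the integers from 1 to n, one at a time."""
    i = 1
    while i <= n:
        yield i  # pause here, hand back i, and remember where we were
        i += 1

gen = count_up_to(3)  # nothing runs yet -- we just get a generator object
print(next(gen))  # 1 -- runs the body until the first yield
print(next(gen))  # 2 -- resumes right after the yield
print(next(gen))  # 3
# One more next(gen) would raise StopIteration: the generator is exhausted.
```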


Generator Expressions (A concise way)

These are similar to list comprehensions but use parentheses ( ) instead of square brackets [ ].

Example: Squares of Numbers

A list comprehension creates the entire list in memory:

# List comprehension (creates list [0, 1, 4, 9] immediately)
my_list = [x*x for x in range(4)]
print(f"List: {my_list}")

A generator expression creates a generator object that yields values one at a time:

# Generator expression (creates generator object)
my_gen_exp = (x*x for x in range(4))
print(f"Generator Expression Object: {my_gen_exp}")

# Iterate over the generator expression
for square in my_gen_exp:
    print(f"Square: {square}")
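The memory difference is easy to measure with sys.getsizeof (exact byte counts vary by Python version):

```python
import sys

big_list = [x * x for x in range(1_000_000)]  # a million results, all in memory
big_gen = (x * x for x in range(1_000_000))   # just a small generator object

print(sys.getsizeof(big_list))  # millions of bytes
print(sys.getsizeof(big_gen))   # a few hundred bytes, regardless of the range
```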

Why Developers Use Generators

Developers primarily use generators for the following use cases:

Reading Large Files:

Processing huge log files or CSVs line by line without loading the entire file content into RAM.

Data Science/ML Pipelines:

Processing massive datasets where samples are fed to a model one batch at a time (e.g., using a generator function as a Keras data sequence).

Streaming Data:

Handling data streams (like network packets, real-time sensor readings) where the total number of items is unknown or potentially infinite.

Pipelining/Chain Iterators:

Creating efficient, readable pipelines where the output of one generator is fed as the input to the next.
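The pipelining idea can be sketched as a chain of generators, each stage pulling values from the previous one on demand (function names are illustrative):

```python
def read_numbers(lines):
    """Parse each raw line into an int, lazily."""
    for line in lines:
        yield int(line)

def squares(numbers):
    """Square each incoming number, lazily."""
    for n in numbers:
        yield n * n

def evens_only(numbers):
    """Keep only even values, lazily."""
    for n in numbers:
        if n % 2 == 0:
            yield n

raw = ["1", "2", "3", "4"]
pipeline = evens_only(squares(read_numbers(raw)))
print(list(pipeline))  # [4, 16] -- nothing ran until list() pulled values
```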


File Tailing Using a Generator

Useful for watching a log file: the generator yields each new record as it is appended.

import time

# File tailing
def follow(file):
    file.seek(0, 2)  # jump to the end of the file
    while True:
        line = file.readline()
        if not line:
            time.sleep(0.1)  # no new line yet; wait briefly and retry
            continue
        yield line

Open the file and tail it: as each new line is appended, it is printed to the terminal.

file = open("Generator.md")
tail = follow(file=file)
print(type(tail))

# Iterate over the generator (runs forever; press Ctrl+C to stop)
for line in tail:
    print(line, end="")

Decorators

Decorator in Python

A Decorator is one of Python’s most elegant features. At its simplest, a decorator is a way to "wrap" a function or a method to change or extend its behavior without permanently modifying its original code.

Think of it like adding toppings to a plain cheese pizza. You aren't changing the fundamental pizza (the function); you are just adding extra layers on top of it.


🛠️ How It Works (The Mechanics)

In Python, functions are "first-class objects." This means functions can be passed around as arguments to other functions, just like variables.

A decorator is just a function that takes another function as an input and returns a new function as an output.

The Anatomy of a Decorator

To understand it, look at this "manual" way of doing it before we use the @ syntax:

def my_decorator(original_function):
    # This inner function 'wraps' the original one
    def wrapper():
        print("1. Doing something before the function runs.")
        original_function()
        print("2. Doing something after the function runs.")
    
    return wrapper # Return the wrapper, ready to be called

def say_hello():
    print("   Hello!")

# Manual decoration
decorated_hello = my_decorator(say_hello)
decorated_hello()

The "Syntactic Sugar" (@)

Python makes this easier with the @ symbol. Placing @decorator_name above a function is exactly the same as writing func = decorator_name(func).

@my_decorator
def say_hello():
    print("   Hello!")

say_hello()

🌍 Real-Life Use Cases

Developers use decorators to keep their code DRY (Don't Repeat Yourself). Instead of writing the same logic in ten different functions, you write one decorator and apply it to all ten.

1. Timing (Profiling)

You want to know how long a specific function takes to run.

import time

def timer(func):
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        end = time.time()
        print(f"Execution time: {end - start:.4f} seconds")
        return result
    return wrapper

@timer
def heavy_computation():
    time.sleep(1.5) # Simulating work
    return "Done!"

heavy_computation()

2. Authorization (Security)

In web frameworks like Flask or Django, decorators check if a user is logged in before allowing them to see a page.

def login_required(func):
    def wrapper(user_is_logged_in):
        if not user_is_logged_in:
            print("Access Denied: Please log in.")
            return
        return func(user_is_logged_in)
    return wrapper

@login_required
def view_secret_dashboard(user_is_logged_in):
    print("Welcome to the secret dashboard!")

🧠 How to Understand It (The Mental Model)

If you are struggling to wrap your head around it, use these three mental steps:

  1. The Box: Imagine the decorator is a box.
  2. The Gift: Your original function is a gift you put inside that box.
  3. The Delivery: When someone asks for the "gift," they actually get the box. The box might have a bow on top (code before) or a "thank you" note inside (code after), but the gift (the function) is still there in the middle.

Key Takeaways for Beginners:

  • Nested Functions: You must be comfortable with functions defined inside other functions.
  • Returning Functions: Remember that the decorator doesn't run the function immediately; it returns a new version of it to be run later.
  • *args and **kwargs: You will often see these inside decorators so that the decorator can work with any function, regardless of how many arguments that function takes.
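One practical addition to the takeaways: a plain wrapper hides the original function's name and docstring, so decorators conventionally apply functools.wraps to the wrapper. A sketch reusing the timer pattern from earlier:

```python
import functools
import time

def timer(func):
    @functools.wraps(func)  # copies __name__, __doc__, etc. onto the wrapper
    def wrapper(*args, **kwargs):
        start = time.time()
        result = func(*args, **kwargs)
        print(f"{func.__name__} took {time.time() - start:.4f}s")
        return result
    return wrapper

@timer
def greet(name):
    """Return a greeting."""
    return f"Hello, {name}!"

print(greet("Ada"))
print(greet.__name__)  # 'greet', not 'wrapper', thanks to functools.wraps
```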
