Senior Software Engineer and Data Scientist focused on systems design, computational geometry, and data-driven modeling. I work at the intersection of backend engineering, scientific computing, and geometry processing, building scalable pipelines and practical tools.
- Background in Physics and Software Engineering
- Experience as Senior Software Engineer, Data Scientist, and CTO & Co-Founder
- Focus on building systems rather than isolated scripts
- Interested in:
  - Distributed systems
  - Geometry processing pipelines
  - BIM / IFC ecosystems
  - Data-intensive applications
  - Infrastructure for scalable computation
- IFC data manipulation and processing
- LIDAR → BIM reconstruction pipelines
- 3D geometry reconstruction and semantic modeling
- Integration with tools like OpenCascade / IFCOpenShell
- API design and implementation (Python / Flask / FastAPI)
- Pipeline architecture and orchestration
- Data ingestion, transformation, and serialization
- Scalable system design
- Physics-based modeling approaches
- Numerical methods and regression-based calibration
- Hybrid models combining first-principles and data-driven methods
- Reusable Python libraries
- Decorators, utilities, and automation tools
- Data integration (e.g. Google Sheets APIs)
- Domain-specific validation and processing tools
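A minimal sketch of the kind of reusable decorator and validation utility mentioned above. All names here (`validate_types`, `rescale`) are hypothetical illustrations, not taken from any published package:

```python
import functools

def validate_types(**expected):
    """Hypothetical decorator: check keyword-argument types before calling."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for name, typ in expected.items():
                if name in kwargs and not isinstance(kwargs[name], typ):
                    raise TypeError(f"{name} must be {typ.__name__}")
            return func(*args, **kwargs)
        return wrapper
    return decorator

@validate_types(scale=float)
def rescale(values, *, scale=1.0):
    return [v * scale for v in values]
```

`functools.wraps` preserves the wrapped function's name and docstring, which keeps such decorators composable in a shared library.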
A selection of projects across GitHub and PyPI:
- IFC processing and manipulation pipelines
- LIDAR → BIM reconstruction workflows
- IFC Web Viewer (Flask + WebGL + Three.js)
- Google Sheets integration utilities
- Vapor pressure modeling tools
- DNA / sequence processing libraries
- Decorators and validation utilities
- CINI code validation tools
- Templates for Python package bootstrapping
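As an illustration of the regression-based calibration mentioned above (e.g. in vapor pressure modeling), here is a sketch that fits the Antoine equation log₁₀(P) = A − B / (C + T) by ordinary least squares, with C held fixed so the problem is linear. The function names and the choice to fix C are assumptions for this example, not the actual implementation of any listed tool:

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def fit_antoine(temps, pressures, c=230.0):
    """Calibrate A and B of log10(P) = A - B / (C + T), with C held fixed."""
    xs = [1.0 / (c + t) for t in temps]          # linearizing substitution
    ys = [math.log10(p) for p in pressures]
    a, slope = fit_line(xs, ys)
    return a, -slope                              # B = -slope
```

With noise-free synthetic data generated from known A and B, the fit recovers both parameters to numerical precision, which makes it a convenient self-test for the calibration pipeline.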
I maintain several public Python packages on PyPI, covering:
- Data integration
- Scientific modeling
- Utility tooling
- Domain-specific processing
- Languages: Python, JavaScript
- Backend: Flask, FastAPI
- Geometry: IFCOpenShell, OpenCascade (pythonocc)
- Data: NumPy, SciPy
- Visualization: WebGL, Three.js
- Cloud / Infra: AWS, GCP, Heroku
- Tooling: Git, CI/CD workflows
- Build systems, not just scripts
- Prefer composable and modular architectures
- Combine domain knowledge with engineering rigor
- Focus on maintainability, scalability, and clarity
- Treat data pipelines as first-class systems
- Scalable geometry processing systems
- Automated BIM generation from unstructured data
- Distributed content-addressable architectures
- Hybrid physical + data-driven modeling systems
- Infrastructure for large-scale data transformation
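The core idea behind content-addressable architectures is that objects are keyed by a hash of their bytes, so storage is idempotent and references are verifiable. A toy in-memory sketch (the class and method names are illustrative, not from any real system):

```python
import hashlib

class ContentStore:
    """Toy in-memory content-addressable store: key = SHA-256 of the bytes."""

    def __init__(self):
        self._objects = {}

    def put(self, data: bytes) -> str:
        digest = hashlib.sha256(data).hexdigest()
        self._objects[digest] = data   # idempotent: same bytes, same key
        return digest

    def get(self, digest: str) -> bytes:
        return self._objects[digest]
```

In a distributed setting the same keying scheme lets any node verify an object against its digest, which is why deduplication and replication come almost for free.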