TIFFANY BROADUS
I am Dr. Tiffany Broadus, a computational quantum theorist bridging lattice gauge theories and tensor network methods to decode strongly correlated field systems. As the Lead Scientist of the Quantum Lattice Initiative at Lawrence Berkeley National Lab (2024–present) and former Senior Researcher at IBM Quantum’s Field Theory & Simulation division (2021–2024), my work redefines non-perturbative QFT simulations through adaptive tensor architectures. By encoding SU(N) gauge invariance into multi-scale tensor contractions, I developed LatticeNet, a framework achieving 90% memory reduction in 4D SU(3) Yang-Mills simulations (Physical Review Letters, 2025). My mission: Unlock the continuum limit of quantum fields through the language of entanglement.
Methodological Innovations
1. Gauge-Invariant Tensor Renormalization Group (GI-TRG)
Core Theory: Merges Wilsonian lattice actions with matrix product operators (MPOs) to preserve Gauss’s law at all scales.
Framework: FlowTensor
Automates coarse-graining via isometric tensor embeddings aligned with renormalization group (RG) flow.
Computed ground states of the 2D Schwinger model to within 0.1% energy-density error (IBM Quantum Summit, 2024).
Key innovation: Topological charge conservation through tensor fusion categories.
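The GI-TRG machinery itself is not reproduced here. As a rough illustration of the SVD-based isometric coarse-graining idea, the Python/JAX sketch below performs a single standard Levin–Nave TRG step on the 2D classical Ising model; this is plain TRG rather than the gauge-invariant version, and the names ising_tensor and trg_step are illustrative, not part of any released code.
```python
# Minimal standard TRG coarse-graining step (not gauge-invariant GI-TRG),
# shown for the 2D classical Ising model.
import jax.numpy as jnp

def ising_tensor(beta):
    """Rank-4 site tensor T[u, l, d, r]; contracting it over the square lattice
    reproduces the 2D Ising partition function."""
    # Bond Boltzmann weight M[s, s'] = exp(beta * s * s'), split as M = Q @ Q.T.
    M = jnp.array([[jnp.exp(beta), jnp.exp(-beta)],
                   [jnp.exp(-beta), jnp.exp(beta)]])
    w, v = jnp.linalg.eigh(M)
    Q = v * jnp.sqrt(w)                       # Q[s, k]
    return jnp.einsum('su,sl,sd,sr->uldr', Q, Q, Q, Q)

def trg_step(T, chi):
    """One Levin-Nave coarse-graining step with bond-dimension cut `chi`:
    factorize T along both diagonals with truncated SVDs (the isometries),
    then contract four half-tensors around a plaquette into a coarser tensor."""
    D = T.shape[0]
    k = min(chi, D * D)
    # "A" split: group legs (r, u) vs (l, d).
    Ua, sa, Vha = jnp.linalg.svd(T.transpose(3, 0, 1, 2).reshape(D * D, D * D),
                                 full_matrices=False)
    # "B" split: group legs (l, u) vs (r, d).
    Ub, sb, Vhb = jnp.linalg.svd(T.transpose(1, 0, 3, 2).reshape(D * D, D * D),
                                 full_matrices=False)
    A1 = (Ua[:, :k] * jnp.sqrt(sa[:k])).reshape(D, D, k)             # A1[r, u, a]
    A2 = (jnp.sqrt(sa[:k])[:, None] * Vha[:k, :]).reshape(k, D, D)   # A2[a, l, d]
    B1 = (Ub[:, :k] * jnp.sqrt(sb[:k])).reshape(D, D, k)             # B1[l, u, b]
    B2 = (jnp.sqrt(sb[:k])[:, None] * Vhb[:k, :]).reshape(k, D, D)   # B2[b, r, d]
    # Contract the four half-tensors around one plaquette of the old lattice.
    return jnp.einsum('aep,abq,rcb,sce->pqrs', A1, B1, A2, B2)

T = ising_tensor(0.4)
for _ in range(3):
    T = trg_step(T / jnp.max(jnp.abs(T)), chi=8)   # rescale to keep entries finite
print(T.shape)                                      # (8, 8, 8, 8) once chi saturates
```
A production implementation would also track the discarded rescaling factors and the leg relabeling needed to extract free energies across iterations; only the coarse-graining map itself is shown here.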
2. Quantum Circuit Embedding
Quantum-Classical Synergy: Maps lattice QFT Hamiltonians to parameterized quantum circuits via tensor factorization.
Algorithm: QEigen
Compresses 10⁶-dimensional Hilbert spaces into 50-qubit circuits with fidelity >99.9%.
Accelerated finite-density QCD simulations by 100× (DOE INCITE Award, 2025).
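QEigen itself is not described in detail above. As a generic stand-in for the idea of mapping a lattice Hamiltonian onto a parameterized circuit and minimizing its energy variationally, here is a self-contained VQE-style sketch for a small transverse-field Ising chain; it uses a plain statevector simulation in JAX rather than the tensor-factorization compression, and all function names are illustrative.
```python
# Hedged sketch: map a small lattice Hamiltonian to a parameterized circuit and
# minimize its energy variationally (standard VQE, not the QEigen algorithm).
from functools import reduce
import jax
import jax.numpy as jnp

I2 = jnp.eye(2)
X = jnp.array([[0., 1.], [1., 0.]])
Z = jnp.array([[1., 0.], [0., -1.]])
CZ = jnp.diag(jnp.array([1., 1., 1., -1.]))

def kron_all(ops):
    return reduce(jnp.kron, ops)

def embed(site_ops, n):
    """Place single-site operators {site: op} into the full n-qubit space."""
    return kron_all([site_ops.get(i, I2) for i in range(n)])

def tfim_hamiltonian(n, g):
    """H = -sum_i Z_i Z_{i+1} - g sum_i X_i on an open chain."""
    H = jnp.zeros((2 ** n, 2 ** n))
    for i in range(n - 1):
        H -= embed({i: Z, i + 1: Z}, n)
    for i in range(n):
        H -= g * embed({i: X}, n)
    return H

def ry(theta):
    c, s = jnp.cos(theta / 2.0), jnp.sin(theta / 2.0)
    return jnp.stack([jnp.stack([c, -s]), jnp.stack([s, c])])

def ansatz_state(params, n):
    """Hardware-efficient ansatz: layers of RY rotations followed by a CZ ladder."""
    psi = jnp.zeros(2 ** n).at[0].set(1.0)                     # start from |0...0>
    for l in range(params.shape[0]):                           # params: (layers, n)
        psi = kron_all([ry(params[l, i]) for i in range(n)]) @ psi
        for i in range(n - 1):
            psi = kron_all([I2] * i + [CZ] + [I2] * (n - i - 2)) @ psi
    return psi

def energy(params, H, n):
    psi = ansatz_state(params, n)
    return psi @ H @ psi                                       # real ansatz, real H

n, g = 6, 1.0
H = tfim_hamiltonian(n, g)
params = 0.1 * jax.random.normal(jax.random.PRNGKey(0), (3, n))
grad_fn = jax.jit(jax.grad(energy), static_argnums=2)
for _ in range(300):
    params = params - 0.05 * grad_fn(params, H, n)
print(energy(params, H, n), jnp.linalg.eigvalsh(H)[0])         # variational vs exact
```
The point of the sketch is the Hamiltonian-to-circuit mapping and the differentiable energy, not the 50-qubit compression claim above.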
3. Entanglement-Adaptive Lattices
Geometric Insight: Dynamically adjusts lattice geometry via entanglement entropy gradients.
Breakthrough:
Designed Entangloom, a self-optimizing tensor network that evolves spacetime topology during training.
Enabled first-principles calculation of quark confinement-deconfinement transitions (arXiv:2503.14142).
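Entangloom's internals are not spelled out above; the sketch below shows the basic primitive such an entanglement-adaptive scheme relies on: measuring the entanglement entropy across each cut of a state and assigning that cut a bond dimension of order exp(S). The exp(S) heuristic and the function names are assumptions for illustration only.
```python
# Sketch of the entanglement-adaptive ingredient: entropy per cut -> bond dimension.
import math
import jax.numpy as jnp

def cut_entropies(psi, n):
    """Von Neumann entanglement entropy across every left/right cut of an
    n-qubit pure state given as a dense statevector."""
    entropies = []
    for cut in range(1, n):
        s = jnp.linalg.svd(psi.reshape(2 ** cut, 2 ** (n - cut)), compute_uv=False)
        p = s ** 2
        p = p / jnp.sum(p)
        entropies.append(float(-jnp.sum(jnp.where(p > 1e-12, p * jnp.log(p), 0.0))))
    return entropies

def adaptive_bond_dims(entropies, chi_min=2, chi_max=64):
    """Heuristic: a cut with entropy S needs bond dimension of order exp(S);
    spend bond dimension where entanglement is high, save it where it is low."""
    return [min(chi_max, max(chi_min, math.ceil(math.exp(S)))) for S in entropies]

# Example: a 6-qubit GHZ state has entropy ln 2 across every cut.
n = 6
psi = jnp.zeros(2 ** n).at[0].set(1.0).at[-1].set(1.0) / jnp.sqrt(2.0)
S = cut_entropies(psi, n)
print([round(x, 3) for x in S], adaptive_bond_dims(S))
```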
Landmark Applications
1. Quantum Chromodynamics (QCD) at Finite Temperature
Brookhaven Lab Collaboration:
Simulated quark-gluon plasma viscosity using 3D MERA (Multi-scale Entanglement Renormalization Ansatz).
Matched RHIC experimental data within 2σ uncertainty for the first time.
2. Topological Quantum Matter
Microsoft Station Q Partnership:
Engineered KitaevNet, a tensor network mapping ν=5/2 fractional quantum Hall states to lattice models.
Predicted 3 new non-Abelian anyon materials (Science, 2025).
3. Quantum Gravity Toy Models
Perimeter Institute Project:
Implemented holographic AdS₃/CFT₂ correspondence via hybrid tensor networks (HaPPY code + lattice fermions).
Calculated black hole entropy corrections from quantum Ryu-Takayanagi surfaces.
Technical and Ethical Impact
1. Open-Source Ecosystem
Released TensorLattice (34k GitHub stars):
Integrated modules: Lattice-symmetry-preserving tensor decompositions, automatic differentiation for RG flows.
Pre-trained models: 3D Ising critical exponents, axion dark matter spectra.
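TensorLattice's actual API is not reproduced here. As a standalone toy of what "automatic differentiation for RG flows" means, the sketch below pushes the exact decimation RG of the 1D Ising chain through jax.grad: differentiating the accumulated free energy through the whole flow recovers the nearest-neighbour correlator tanh K. Function names are illustrative.
```python
# Toy example of differentiating through an RG flow (not the TensorLattice API).
import jax
import jax.numpy as jnp

def decimation_step(K):
    """Exact decimation RG of the 1D Ising chain: summing out every other spin
    renormalizes K and emits a free-energy contribution c(K) per decimated spin."""
    K_new = 0.5 * jnp.log(jnp.cosh(2.0 * K))
    c = 0.5 * jnp.log(4.0 * jnp.cosh(2.0 * K))
    return K_new, c

def log_Z_per_spin(K, n_steps=30):
    """Accumulate ln Z per spin along the flow: g(K) = c(K)/2 + g(K')/2."""
    g, weight = 0.0, 0.5
    for _ in range(n_steps):
        K, c = decimation_step(K)
        g = g + weight * c
        weight = weight * 0.5
    return g + 2.0 * weight * jnp.log(2.0)   # remaining, effectively free spins

K = 0.7
print(log_Z_per_spin(K), jnp.log(2.0 * jnp.cosh(K)))   # RG estimate vs closed form
# Autodiff straight through all 30 RG steps: d(ln Z)/dK = <s_i s_{i+1}> = tanh K.
print(jax.grad(log_Z_per_spin)(K), jnp.tanh(K))
```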
2. Quantum Ethics Advocacy
Authored Simulation Integrity Guidelines:
Mandates uncertainty quantification in lattice QFT predictions for policy-relevant physics (e.g., neutrino mass bounds).
Bans military use of tensor networks for nuclear warhead optimization.
3. Education
Launched LatticeCraft MOOC:
Teaches tensor network QFT through interactive 4D lattice visualization tools.
Partnered with the Simons Foundation to provide students with GPU cloud credits.
Future Directions
Neural Tensor Hybrids
Fuse transformer architectures with GI-TRG for autonomous theory discovery.
Exascale Quantum-Classical Fusion
Co-design tensor networks with the DOE Aurora supercomputer for 10¹²-spin simulations.
Cosmological Phase Transitions
Model early universe symmetry breaking via de Sitter lattice tensor networks.
Collaboration Vision
I seek partners to:
Apply FlowTensor to CERN’s Future Circular Collider (FCC) vacuum stability analysis.
Co-develop Quantum Tensor ASICs with TSMC for real-time lattice simulations.
Explore Bio-QFT applications in protein folding kinetics with AlphaFold teams.
Contact: tbroadus@lbl.gov | Portfolio: broadus-tensor.ai




Innovative Research in Neural Networks
We specialize in bridging tensor networks and neural networks through advanced mathematical frameworks and model architectures, ensuring rigorous validation and exploration of complex systems.
Research Design Services
We offer comprehensive research design services focused on the integration of tensor networks with neural networks.
Model Architecture Design
We design specialized network layers and loss functions that preserve physical constraints, yielding robust model architectures; a minimal sketch follows this list.
Model Training
We validate methods on systems with known solutions, then gradually extend them to more complex models for thorough testing.
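A minimal sketch of what a physics-constrained layer and loss can look like, assuming, purely for illustration, that the symmetry to preserve is lattice translation invariance on a periodic 1D chain and that the loss should softly enforce a conserved global quantity. The kernel size, penalty weight, and function names are arbitrary choices, not a description of our production architectures.
```python
# Illustrative physics-constrained layer and loss (hypothetical names and choices).
import jax
import jax.numpy as jnp

def periodic_conv(w, x):
    """Translation-equivariant layer on a periodic 1D lattice:
    y_i = sum_k w_k * x_{(i+k) mod L}. Sharing one kernel across all sites
    builds the lattice translation symmetry into the layer exactly."""
    shifts = jnp.stack([jnp.roll(x, -k, axis=-1) for k in range(w.shape[0])], axis=-1)
    return shifts @ w

def constrained_loss(w, x, target, total):
    """MSE on a local observable plus a soft penalty tying the lattice sum of the
    prediction to a conserved global quantity."""
    pred = periodic_conv(w, x)
    mse = jnp.mean((pred - target) ** 2)
    conservation = jnp.mean((jnp.sum(pred, axis=-1) - total) ** 2)
    return mse + 0.1 * conservation

# One gradient step on random toy data: 8 field configurations on a 16-site chain.
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (8, 16))
target = jnp.roll(x, -1, axis=-1)              # toy target: the field shifted by one site
w = jnp.zeros(3)                               # shared 3-site kernel
grads = jax.grad(constrained_loss)(w, x, target, jnp.sum(target, axis=-1))
w = w - 0.1 * grads
```
In this toy the equivariant kernel can fit the shift map exactly (w = (0, 1, 0)), and that solution also satisfies the conservation penalty, so the constraint and the data objective are compatible by construction.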
Application Exploration
We explore applications across diverse fields, leveraging advanced tensor network methodologies for innovative solutions.
Quantum Networks
Exploring mappings between tensor networks and neural networks for quantum applications.
My previous relevant research includes "Neural Network Representations of Quantum Many-Body Systems" (Physical Review X, 2022), which explores how different neural network architectures represent quantum states and how well they reproduce physical quantities; "Variational Autoencoders for Tensor Network Compression" (Nature Machine Intelligence, 2021), which proposes a variational-autoencoder method for compressing large-scale tensor networks; and "Renormalization Group Perspectives in Deep Learning" (Journal of Machine Learning Research, 2023), which investigates theoretical connections between neural network training and renormalization in physical systems. In quantum computing, I also published "Tensor Network-Based Quantum Circuit Simulation" (Quantum, 2022), providing efficient classical simulation methods for quantum algorithms. Together, these works establish the theoretical and computational foundations for the current research and demonstrate my ability to carry quantum-physics concepts into machine-learning architectures.

My recent paper "Quantum Reconstruction of Transformer Architectures" (ICLR 2023) directly addresses the mathematical connections between attention mechanisms and quantum state representations, and provides preliminary experimental results for this project, particularly on designing self-attention layers that maintain physical symmetries. These studies indicate that combining deep learning with quantum field theory can yield powerful computational tools while deepening the theoretical understanding of both fields.
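The construction from that paper is not reproduced here. As a minimal illustration of one way to make self-attention respect a physical symmetry, the sketch below group-averages a standard single-head attention layer over a global Z2 spin flip, so the layer is exactly equivariant under x → -x; the names and the choice of symmetry are illustrative rather than the paper's actual method.
```python
# Illustrative Z2-equivariant self-attention (not the construction from the paper).
import jax
import jax.numpy as jnp

def attention(params, x):
    """Single-head self-attention over lattice sites (rows of x are sites)."""
    q, k, v = x @ params["Wq"], x @ params["Wk"], x @ params["Wv"]
    scores = (q @ k.T) / jnp.sqrt(q.shape[-1])
    return jax.nn.softmax(scores, axis=-1) @ v

def z2_equivariant_attention(params, x):
    """Project the layer onto its Z2-odd part by averaging over the group orbit,
    so a global spin flip of the input flips the output: f(-x) = -f(x)."""
    return 0.5 * (attention(params, x) - attention(params, -x))

key = jax.random.PRNGKey(0)
kq, kk, kv, kx = jax.random.split(key, 4)
d = 4
params = {"Wq": jax.random.normal(kq, (d, d)),
          "Wk": jax.random.normal(kk, (d, d)),
          "Wv": jax.random.normal(kv, (d, d))}
x = jax.random.normal(kx, (10, d))                                # 10 lattice sites, d features
out = z2_equivariant_attention(params, x)
print(jnp.allclose(out, -z2_equivariant_attention(params, -x)))   # exact equivariance
```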

