% This file was adapted from ICLR2022_conference.tex example provided for the ICLR conference \documentclass{article} % For LaTeX2e \usepackage{conference,times} \usepackage{easyReview} \usepackage{algorithm} \usepackage{algorithmic} % Optional math commands from https://github.com/goodfeli/dlbook_notation. \input{math_commands.tex} % Please leave these options as they are \usepackage{hyperref} \hypersetup{ colorlinks=true, linkcolor=red, filecolor=magenta, urlcolor=blue, citecolor=purple, pdftitle={Navier--Stokes Regularity via Critical Norm Tracking}, pdfpagemode=FullScreen, } \title{Navier--Stokes Regularity via Critical Norm Tracking} \author{Anonymous Authors \\ Affiliation \\ Address \\ \texttt{email}} \begin{document} \maketitle \begin{abstract} We study 3D incompressible Navier--Stokes flow on a periodic box and design a diagnostic suite that links classical Prodi--Serrin mixed norms, weak-$L^p$ (Lorentz) proxies, scaling-invariant weighted criteria, and sparseness-based geometry. The goal is to evaluate how these regularity indicators behave in direct numerical simulation (DNS) across Reynolds-number sweeps and to quantify numerical sensitivities due to de-aliasing. The study matters for both mathematical regularity theory and practical turbulence modeling because it formalizes computable indicators that can be tracked in simulations and compared to validated datasets. We provide a formal problem setting, an algorithmic pipeline, and a hypothesis-driven evaluation protocol, with explicit acceptance criteria and uncertainty procedures. Empirical results are not yet available because the required scientific Python stack and external dataset access are currently unavailable in the execution environment; we therefore report the full experimental design and the evidence plan that will be used once runs are enabled. 
\end{abstract} \section{Introduction} The Navier--Stokes regularity problem connects fundamental analysis with predictive modeling of turbulent flows. In practice, DNS and validated databases offer a route to empirically probe critical-norm growth, but bridging the theory--diagnostic gap is nontrivial. Classical Prodi--Serrin mixed-norm criteria provide a baseline for regularity, while weak-$L^p$ and log-improved variants suggest that more permissive integrability can still imply smoothness \citep{tran_yu_2016,bosia_pata_robinson_2014,bjorland_vasseur_2009}. Parallel geometric approaches quantify sparseness of intense structures and point to asymptotic criticality, supported by numerical evidence in isotropic turbulence \citep{grujic_xu_2024,rafner_et_al_2021}. The present work integrates these lines into a coherent computational protocol that can be run on a periodic box and compared with benchmark data from the Johns Hopkins turbulence database (JHTDB) \citep{li_et_al_2008,jhtdb_dataset_2024}. Beyond mathematical interest, diagnostics that track near-critical growth have cross-domain relevance: they inform numerical reliability in spectral simulations, support uncertainty-aware turbulence modeling, and provide interpretable indicators for extreme-event forecasting. We focus on the $2\pi$-periodic setting with 2/3 de-aliasing and a constrained compute budget, reflecting realistic limitations of small-scale experimental studies and open-source pipelines \citep{patterson_orszag_1971,orszag_1971}. \paragraph{Contributions.} \begin{itemize} \item We formalize a diagnostic suite that unifies classical mixed norms, weak-$L^p$ (Lorentz) proxies, scaling-invariant weighted criteria, and sparseness metrics in a single evaluation plan. \item We specify a hypothesis-driven evaluation protocol with Reynolds-number sweeps, de-aliasing variants, and explicit acceptance criteria for each diagnostic, enabling reproducible falsification. 
\item We introduce an aliasing error budget analysis to quantify how high-wavenumber filtering affects near-critical norm spikes. \item We provide a reproducibility plan with seeds, sweeps, compute limits, and uncertainty procedures tailored to periodic DNS and benchmark validation. \end{itemize} \section{Background and Related Work} Classical Prodi--Serrin criteria guarantee regularity when the velocity lies in $L^p_t L^q_x$ with $2/p + 3/q \le 1$ and $q>3$, and they remain a standard baseline for mixed-norm tracking \citep{tran_yu_2016}. Weak-$L^p$ extensions replace spatial $L^q$ norms with Lorentz $L^{r,\infty}$ norms, widening the admissible class while retaining regularity guarantees \citep{bosia_pata_robinson_2014}. Log-improved criteria further relax temporal integrability, suggesting that classical norms may be overly conservative when used as numerical diagnostics \citep{bjorland_vasseur_2009}. Geometric regularity approaches shift attention from integrability to the sparseness of super-level sets of vorticity or higher derivatives, yielding a framework in which the scaling gap closes asymptotically with derivative order \citep{grujic_xu_2024}. Numerical evidence in Kida vortex bursts and isotropic turbulence indicates that sparseness exponents can exceed critical thresholds in practice \citep{rafner_et_al_2021}. However, comparisons between sparseness indicators and mixed-norm diagnostics are limited, and their relative stability under numerical discretization is underexplored. On the computational side, pseudo-spectral DNS with 2/3 de-aliasing is a standard technique to control aliasing errors, and additional high-wavenumber filtering has been proposed to further suppress numerical contamination \citep{patterson_orszag_1971,orszag_1971}. The JHTDB infrastructure supplies validated isotropic turbulence data and web-service access to derivatives and spectra, making it a natural benchmark for diagnostic validation \citep{li_et_al_2008,jhtdb_dataset_2024}. 
The gap motivating this work is a unified, DNS-computable diagnostic suite that connects these mathematical criteria with realistic numerical constraints and explicit evidence thresholds. \section{Problem Setting} Let $\mathbb{T}^3 = [0,2\pi)^3$ be the periodic domain. We study the incompressible Navier--Stokes equations \begin{equation} \partial_t u + (u \cdot \nabla)u = -\nabla p + \nu \Delta u, \quad \nabla \cdot u = 0, \end{equation} with initial data $u_0 \in H^1(\mathbb{T}^3)$, viscosity $\nu>0$, and Reynolds number $\mathrm{Re} = UL/\nu$ with characteristic velocity $U$ and length $L = 2\pi$. The Leray--Hopf energy inequality is \begin{equation} \|u(t)\|_2^2 + 2\nu \int_0^t \|\nabla u(\tau)\|_2^2 \, d\tau \le \|u_0\|_2^2, \end{equation} which anchors our diagnostic normalization and burst detection \citep{tran_yu_2016}. For Prodi--Serrin pairs $(p,q)$ satisfying $2/p + 3/q = 1$ and $q>3$, we define the mixed norm \begin{equation} \|u\|_{L^p_t L^q_x} = \left( \int_0^T \left( \int_{\mathbb{T}^3} |u(x,t)|^q \, dx \right)^{p/q} dt \right)^{1/p}. \end{equation} Weak-$L^p$ diagnostics replace the spatial $L^q$ norm with a Lorentz norm \begin{equation} \|u\|_{L^s_t L^{r,\infty}_x} = \left( \int_0^T \left( \sup_{\alpha>0} \alpha\, \mu\big(\{x: |u(x,t)|>\alpha\}\big)^{1/r} \right)^s dt \right)^{1/s}, \end{equation} where $3/r + 2/s = 1$ and $\mu$ denotes Lebesgue measure \citep{bosia_pata_robinson_2014,bjorland_vasseur_2009}. A scaling-invariant weighted diagnostic is defined by \begin{equation} W(T) = \int_0^T \frac{\|u(\cdot,t)\|_{L^q}^{p}}{\big(1 + S(u(t))\big)^{\gamma}} \, dt, \end{equation} where $S(u)$ is a scaling-invariant norm such as $\|u\|_{L^3}$ or $\|u\|_{\dot H^{1/2}}$ and $\gamma>0$ \citep{tran_yu_2016}. 
For sparseness, define the super-level set of derivatives $D^k u$ by \begin{equation} S_k(\lambda,t) = \{x \in \mathbb{T}^3 : |D^k u(x,t)| > \lambda\}, \quad \mu(S_k(\lambda,t)) \propto \lambda^{-\alpha_k}, \end{equation} and compare the estimated exponent $\alpha_k$ to the critical scaling $\alpha_k^\ast$ implied by the Navier--Stokes scaling $u_\lambda(x,t)=\lambda u(\lambda x,\lambda^2 t)$ \citep{grujic_xu_2024,rafner_et_al_2021}. \section{Methods} \subsection{Numerical setup and validation context} We target pseudo-spectral DNS on $\mathbb{T}^3$ with 2/3 de-aliasing and optional high-wavenumber filtering. This choice is grounded in classical aliasing-control analysis \citep{patterson_orszag_1971,orszag_1971} and enables direct comparisons to JHTDB isotropic turbulence spectra and derivative statistics when accessible \citep{li_et_al_2008,jhtdb_dataset_2024}. Reynolds numbers are swept over $\{200,400,800,1600\}$, with resolutions chosen to respect the compute budget while resolving intermittency at a coarse-grained level. \subsection{Diagnostic components and motivation} \textbf{Classical mixed norms.} Prodi--Serrin norms provide the baseline critical-line diagnostic and are the reference for all comparative metrics \citep{tran_yu_2016}. \textbf{Weak-$L^p$ (Lorentz) proxy.} Weak-$L^p$ criteria are theoretically more permissive than Lebesgue-based criteria, motivating a proxy that estimates Lorentz norms from empirical order statistics of $|u|$ \citep{bosia_pata_robinson_2014,bjorland_vasseur_2009}. \textbf{Scaling-invariant weighting.} Weighted diagnostics based on scaling-invariant norms are expected to damp transient spikes while preserving sensitivity to sustained growth \citep{tran_yu_2016}. \textbf{Sparseness metrics.} Super-level set geometry connects directly to asymptotic criticality and provides a complementary, geometric signal for burst dynamics \citep{grujic_xu_2024,rafner_et_al_2021}. 
\textbf{Aliasing error budget.} Comparing baseline 2/3 de-aliasing to filtered variants quantifies numerical contamination of near-critical spikes, grounding the diagnostic interpretation in numerical analysis \citep{patterson_orszag_1971,orszag_1971}. \subsection{Pipeline overview} The pipeline consists of (i) DNS generation with controlled de-aliasing, (ii) diagnostic extraction for mixed norms, weak-$L^p$ proxies, and sparseness metrics, (iii) aggregation across Reynolds numbers and seeds with uncertainty estimation, and (iv) validation against benchmark spectra or gradient statistics when dataset access is available \citep{li_et_al_2008,jhtdb_dataset_2024}. Each module is designed to be replaceable: alternative DNS solvers can feed the same diagnostic layer, and validation can use either JHTDB or synthetic proxies when external access is blocked. \begin{algorithm} \caption{Critical-norm diagnostic pipeline for periodic DNS.} \label{alg:pipeline} \begin{algorithmic} \STATE Select $(p,q)$ pairs on the Prodi--Serrin line and set sweep parameters. \FOR{each $\mathrm{Re}$, resolution, and de-aliasing variant} \STATE Run pseudo-spectral DNS to produce snapshots of $u(x,t)$. \STATE Compute mixed norms, weak-$L^p$ proxy statistics, and sparseness exponents. \STATE Detect burst windows via enstrophy or spectral deviations. \STATE Aggregate metrics across seeds and compute uncertainty estimates. \ENDFOR \STATE Compare metrics to acceptance criteria in Table~\ref{tab:hypotheses}. \end{algorithmic} \end{algorithm} \section{Evaluation Protocol and Planned Evidence} Table~\ref{tab:hypotheses} summarizes the four hypotheses and the evidence required to validate each claim. The metrics are chosen to directly test whether weak-$L^p$ proxies diverge from classical norms (H1), whether sparseness exponents exceed criticality (H2), whether weighting stabilizes diagnostics (H3), and whether aliasing control bounds spurious spikes (H4). 
By defining quantitative acceptance criteria, the table serves as the evidence specification for each hypothesis and will be populated once DNS runs are available. \begin{table}[t] \caption{Hypotheses, diagnostics, and acceptance criteria defining the evidence required to validate each claim. Meeting the criteria provides support for the corresponding hypothesis, while violations indicate falsification or sensitivity to numerical settings.} \label{tab:hypotheses} \centering \small \renewcommand{\arraystretch}{1.1} \setlength{\tabcolsep}{4pt} \resizebox{\linewidth}{!}{ \begin{tabular}{p{1.6cm} p{6.6cm} p{6.6cm}} \textbf{Hypothesis} & \textbf{Diagnostic evidence} & \textbf{Acceptance criteria (planned)} \\ \hline H1 Weak-$L^p$ divergence & Time series of $\|u\|_{L^p_t L^q_x}$ vs Lorentz proxy; peak-to-mean ratios; cross-correlation and burst alignment. & Lorentz proxy peak-to-mean ratio $\le 0.9\times$ classical in $\ge 3/4$ Reynolds numbers and correlation drops below $0.9$ in burst windows for $\ge 2$ seeds. \\ H2 Sparseness criticality & Log--log fits of $\mu(S_k(\lambda,t))$ vs $\lambda$ for $k=1,2$; $\alpha_k$ vs $\alpha_k^\ast$; alignment with enstrophy peaks. & Fit $R^2 \ge 0.9$ in $\ge 70\%$ of windows and $\alpha_k > \alpha_k^\ast$ in $\ge 3/4$ Reynolds numbers for $k=1,2$. \\ H3 Weighted stability & Weighted diagnostic $W(t)$ vs unweighted mixed norm; variance, lead time, and false positive rate. & Variance $\le 0.85\times$ unweighted in $\ge 3/4$ Reynolds numbers and lead time improves $\ge 10\%$ without $>5\%$ FPR increase. \\ H4 Aliasing error budget & Energy spectra and critical norms under baseline and filtered de-aliasing; spike reduction metrics. & Filtering reduces spike amplitude $\ge 15\%$ with energy decay change $\le 5\%$ in $\ge 3/4$ Reynolds numbers. \\ \end{tabular}} \end{table} Table~\ref{tab:protocol} records the sweep parameters, seeds, and datasets that will be used to populate the evidence in Table~\ref{tab:hypotheses}. 
Together, these tables define the experimental design and establish how each hypothesis will be tested and compared across Reynolds numbers and numerical variants. \begin{table}[t] \caption{Simulation protocol for periodic DNS, including Reynolds numbers, resolutions, seeds, and diagnostic parameter sweeps. These settings specify the experimental conditions used to populate the evidence criteria in Table~\ref{tab:hypotheses}.} \label{tab:protocol} \centering \small \renewcommand{\arraystretch}{1.1} \setlength{\tabcolsep}{4pt} \resizebox{\linewidth}{!}{ \begin{tabular}{p{3.2cm} p{11.6cm}} \textbf{Component} & \textbf{Specification} \\ \hline Domain and solver & $\mathbb{T}^3$ with 2/3 de-aliasing; optional high-$k$ filter variants; pseudo-spectral DNS \citep{patterson_orszag_1971,orszag_1971}. \\ Reynolds sweep & $\mathrm{Re} \in \{200, 400, 800, 1600\}$. \\ Resolutions & $64^3$, $96^3$, and $128^3$ grids (as budget permits). \\ Seeds & $\{11, 23, 37, 53\}$ with fixed initialization protocol. \\ Mixed-norm pairs & $(p,q) \in \{(4,6), (8,4), (20,10/3)\}$ on the critical line. \\ Weak-$L^p$ proxy & Lorentz exponents $(r,s) \in \{(4,4), (6,6)\}$ and sampling stride $\{1,2,4\}$. \\ Sparseness fits & Threshold grids on logspace ranges; derivative orders $k \in \{1,2\}$; fit windows $\{0.2, 0.3\}$. \\ Validation datasets & Synthetic DNS and JHTDB isotropic turbulence when accessible \citep{li_et_al_2008,jhtdb_dataset_2024}. \\ \end{tabular}} \end{table} \section{Results Status} At present, the results section is restricted to the evidence plan described in Tables~\ref{tab:hypotheses} and \ref{tab:protocol}. Full DNS runs, figures, and quantitative tables have not yet been generated because the scientific Python stack and external data access required for the diagnostic pipeline are unavailable in the current execution environment. As a consequence, no empirical confirmation or refutation of H1--H4 is reported here. 
The planned analysis will populate the metrics in Table~\ref{tab:hypotheses} and will compare the protocol settings of Table~\ref{tab:protocol} against the acceptance criteria once execution is enabled. \section{Limitations and Future Work} The principal limitation is a data and tooling gap: DNS-derived diagnostics and JHTDB validations cannot currently be executed, and only the experimental protocol is available. This gap limits conclusions to methodological design and precludes empirical validation of the hypotheses in Table~\ref{tab:hypotheses}. It also prevents quantifying uncertainty intervals and numerical sensitivity in a data-driven manner. \subsection{Future Work} Once the required dependencies and dataset access are available, the immediate follow-up is to run the full Reynolds-number sweep, populate Tables~\ref{tab:hypotheses} and \ref{tab:protocol} with measured metrics, and produce the corresponding figures. Additional experiments will refine sampling strides, sparseness fit windows, and filter strengths to assess robustness, and uncertainty will be quantified using repeated seeds and bootstrap confidence intervals. \section{Conclusion} We present a unified diagnostic framework for critical norm tracking in periodic DNS that integrates classical mixed norms, weak-$L^p$ proxies, scaling-invariant weighting, and sparseness geometry. The evaluation protocol and acceptance criteria in Tables~\ref{tab:hypotheses} and \ref{tab:protocol} define a reproducible evidence plan that will test whether these diagnostics diverge in turbulent bursts and whether numerical aliasing contaminates near-critical spikes. While empirical results are pending, the framework establishes a clear path for bridging theoretical regularity criteria with operational turbulence diagnostics. 
\bibliographystyle{conference} \bibliography{references} \appendix \section{Reproducibility and Implementation Details} All runs are planned on a single machine with a total compute budget of at most four CPU hours. The experimental sweep uses Reynolds numbers $\{200,400,800,1600\}$, grid resolutions $64^3$ to $128^3$, and seeds $\{11,23,37,53\}$. De-aliasing follows the 2/3 rule with optional exponential or sharp high-$k$ filtering variants. Diagnostics include mixed-norm pairs $(p,q) \in \{(4,6),(8,4),(20,10/3)\}$, weak-$L^p$ proxy parameters $(r,s) \in \{(4,4),(6,6)\}$, and sparseness thresholds on logspace grids with fit-window fractions in $\{0.2,0.3\}$. Burst detection uses enstrophy growth or spectral deviation thresholds tuned per seed to reduce false positives. Uncertainty will be estimated via repeated-seed aggregation with nonparametric bootstrap confidence intervals for peak-to-mean ratios, correlation metrics, and sparseness exponents. Approximations include short time horizons and coarse grids to respect the budget; these approximations will be reported alongside all metrics to contextualize sensitivity. Validation against JHTDB will use the published isotropic turbulence dataset and its DOI record when access is enabled \citep{li_et_al_2008,jhtdb_dataset_2024}. \end{document}