Beyond ATE: Multi-Criteria Design for A/B Testing
arXiv:2509.05864v2 Announce Type: replace-cross
Abstract: In the era of large-scale AI deployment and high-stakes clinical trials, adaptive experimentation faces a “trilemma” of conflicting objectives: minimizing cumulative regret (the welfare lost during the experiment), maximizing the estimation accuracy of conditional average treatment effects (CATE), and ensuring differential privacy (DP) for participants. The existing literature typically optimizes these objectives in isolation or under restrictive parametric assumptions. In this work, we study the multi-objective design of adaptive experiments in a general non-parametric setting. First, we rigorously characterize the instance-dependent Pareto frontier between cumulative regret and estimation error, revealing the fundamental statistical limits of dual-objective optimization. We propose ConSE, a sequential segmentation-and-elimination algorithm that adaptively discretizes the covariate space and achieves this Pareto-optimal frontier. Second, we introduce DP-ConSE, a privacy-preserving extension that satisfies Joint Differential Privacy. We demonstrate that privacy comes “for free” in our framework, incurring only asymptotically negligible costs in regret and estimation accuracy. Finally, we establish a robust link between experimental design and long-term utility: we prove that any policy derived from our Pareto-optimal algorithms minimizes post-experiment simple regret, regardless of the specific exploration-exploitation trade-off chosen during the trial. Our results provide a theoretical foundation for designing ethical, private, and efficient adaptive experiments in sensitive domains.
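
The abstract gives no pseudocode, so the following is a minimal, self-contained Python sketch of what a segmentation-and-elimination scheme of this flavor can look like: the covariate space is covered by intervals, suboptimal arms are eliminated per segment via confidence bounds, and well-sampled segments are split so that CATE estimates become locally accurate. Every concrete choice below (the Hoeffding-style radius, the splitting threshold, the reset of statistics after a split, and names like Segment and run) is an illustrative assumption, not the paper's specification.

    # Illustrative sketch of a segmentation-and-elimination scheme in the
    # spirit of ConSE; all constants and rules here are assumptions.
    import math
    import random
    from dataclasses import dataclass, field

    @dataclass
    class Segment:
        lo: float                                   # left endpoint of the covariate interval
        hi: float                                   # right endpoint (exclusive)
        arms: list = field(default_factory=list)    # surviving arm indices
        counts: dict = field(default_factory=dict)  # arm -> number of pulls
        sums: dict = field(default_factory=dict)    # arm -> cumulative reward

        def mean(self, a):
            return self.sums[a] / max(self.counts[a], 1)

        def radius(self, a, delta=0.05):
            # Hoeffding-style confidence radius; one plausible choice.
            n = max(self.counts[a], 1)
            return math.sqrt(math.log(2.0 / delta) / (2.0 * n))

    def make_segment(lo, hi, arms):
        s = Segment(lo, hi, arms=list(arms))
        for a in s.arms:
            s.counts[a], s.sums[a] = 0, 0.0
        return s

    def choose_arm(seg):
        # Uniform exploration within a segment: pull the least-sampled survivor.
        return min(seg.arms, key=lambda a: seg.counts[a])

    def eliminate(seg):
        # Drop arms whose UCB falls below the best surviving LCB.
        best_lcb = max(seg.mean(a) - seg.radius(a) for a in seg.arms)
        seg.arms = [a for a in seg.arms
                    if seg.mean(a) + seg.radius(a) >= best_lcb]

    def maybe_split(segments, seg, min_pulls=200):
        # Refine the partition once a segment is well sampled; children
        # inherit the surviving arms but restart statistics (a simplification).
        if sum(seg.counts.values()) >= min_pulls and seg.hi - seg.lo > 1e-3:
            mid = (seg.lo + seg.hi) / 2.0
            segments.remove(seg)
            segments += [make_segment(seg.lo, mid, seg.arms),
                         make_segment(mid, seg.hi, seg.arms)]

    def run(T=5000, n_arms=3, seed=0):
        rng = random.Random(seed)
        segments = [make_segment(0.0, 1.0, range(n_arms))]
        for _ in range(T):
            x = rng.random()                          # covariate arrives
            seg = next(s for s in segments if s.lo <= x < s.hi)
            a = choose_arm(seg)
            # Toy outcome model: arm quality varies with the covariate.
            reward = 1.0 if rng.random() < 0.3 + 0.2 * a * x else 0.0
            seg.counts[a] += 1
            seg.sums[a] += reward
            if len(seg.arms) > 1:
                eliminate(seg)
            maybe_split(segments, seg)
        return segments

    if __name__ == "__main__":
        for s in run():
            print(f"[{s.lo:.2f}, {s.hi:.2f}] surviving arms: {s.arms}")

The elimination rule controls cumulative regret (bad arms stop being pulled), while the splitting rule controls CATE estimation error (the partition adapts to the data); tuning how aggressively to split versus eliminate is one way to move along the regret-versus-accuracy frontier the abstract describes.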
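
The abstract states that DP-ConSE satisfies Joint Differential Privacy but does not describe the mechanism. One standard route in bandit-style algorithms is to privatize the sufficient statistics that drive elimination, e.g., with Laplace noise; the sketch below (the function names, the epsilon accounting, and the sensitivity bound are all assumptions for illustration) shows why such noise can be asymptotically negligible, which is the usual intuition behind "privacy for free" results of this kind.

    # Minimal sketch of privatizing an elimination statistic with Laplace
    # noise; this is one standard DP technique, not DP-ConSE's mechanism.
    import random

    def laplace(scale, rng):
        # The difference of two i.i.d. exponentials is Laplace(0, scale).
        return rng.expovariate(1.0 / scale) - rng.expovariate(1.0 / scale)

    def private_mean(total, n, epsilon, rng):
        # With rewards in [0, 1], one participant changes the sum by at
        # most 1, so Laplace(1/epsilon) noise on the sum makes this
        # statistic epsilon-DP; dividing by n returns it to the mean scale.
        return (total + laplace(1.0 / epsilon, rng)) / max(n, 1)

    if __name__ == "__main__":
        rng = random.Random(0)
        # The privacy noise on the mean decays as O(1/(epsilon * n)),
        # while a Hoeffding confidence radius decays only as O(1/sqrt(n)),
        # so for large n the noise is dominated by the statistical error.
        for n in (100, 10_000, 1_000_000):
            print(n, private_mean(0.5 * n, n, epsilon=1.0, rng=rng))

Because the noise term shrinks strictly faster than the confidence radius, feeding such privatized means into the elimination step changes the regret and estimation guarantees only by lower-order terms; the paper's precise argument and privacy accounting may of course differ.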