Graduate Student Seminar

Location

University Center: 115

Date & Time

April 6, 2016, 11:00 am to 12:00 pm

Description

Session Chair: Nicolle Massarelli
Discussant: Dr. Hye-Won Kang

Speaker 1: Joshua Hudson
Title
Data Assimilation for the Navier-Stokes Equations and the Magneto-Hydrodynamic Equations in 2D
Abstract
In the study of a differential equation, dY/dt = F(Y), subject to certain boundary conditions, theory is usually developed to show that a unique solution arises from a given initial value Y(0) = Y_0, and that the solution changes continuously with respect to changes in the initial value. In practice, however, the initial value may not be known accurately enough to appeal to this continuity. Data assimilation attempts to compensate for this limited accuracy of the measured initial condition by taking measurements as time goes on and feeding them back into the differential equation.

There are various ways of doing this, and in this talk we will discuss a few algorithms for the incompressible Navier-Stokes equations (NSE) and the magneto-hydrodynamic equations (MHD), systems of partial differential equations that govern the motion of a fluid in the absence (NSE) or presence (MHD) of a magnetic field.

We will see that in 2D, algorithms exist that give exponential convergence of the data assimilation solution to the true value of the observable, including some that collect measurements on fewer variables than the total number of unknowns in the system.
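The feedback idea behind these algorithms can be sketched on a finite-dimensional toy problem. The snippet below applies a nudging-type correction (in the spirit of the Azouani-Olson-Titi algorithm) to the Lorenz-63 system, observing only the first of the three unknowns; all parameter values are illustrative assumptions, not from the talk.

```python
import numpy as np

def lorenz(u, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # Classical Lorenz-63 vector field, a toy stand-in for NSE/MHD.
    x, y, z = u
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def rk4_step(u, f, dt):
    # One step of the classical fourth-order Runge-Kutta method.
    k1 = f(u)
    k2 = f(u + 0.5 * dt * k1)
    k3 = f(u + 0.5 * dt * k2)
    k4 = f(u + dt * k3)
    return u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

dt, mu = 0.001, 50.0                    # time step and nudging strength (assumed)
truth = np.array([1.0, 1.0, 1.0])       # the "true" trajectory
assim = np.array([10.0, -5.0, 30.0])    # assimilated solution, wrong initial value

for _ in range(20000):
    obs = truth[0]  # observe only the first component of the truth
    # Nudge the assimilated solution toward the observation:
    nudged = lambda u: lorenz(u) - mu * np.array([u[0] - obs, 0.0, 0.0])
    assim = rk4_step(assim, nudged, dt)
    truth = rk4_step(truth, lorenz, dt)

# The error should have shrunk substantially despite the wrong start.
print(np.linalg.norm(truth - assim))
```

Note that the correction term only involves the observed component; the remaining unknowns are recovered through the coupling in the equations, mirroring the "fewer variables than unknowns" phenomenon described above.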

Speaker 2: Teresa Lebair
Title
Asymptotic Performance Analysis of a Shape Constrained B-Spline Estimator in the Supremum Norm
Abstract
Shape constrained estimation has received substantial attention from researchers, as numerous functions arising in applications adhere to shape constraints such as monotonicity or convexity. In this talk we consider a B-spline estimator whose (m-1)-st derivative is increasing, and study the asymptotic performance of this estimator (with respect to the supremum norm) over a class of functions satisfying the same shape constraint. In particular, we first discuss a minimax lower bound for this estimation problem. Next, we examine a critical uniform Lipschitz property of the estimator. Finally, we exploit this uniform Lipschitz property to bound the estimator's bias and stochastic error. We will see that the estimator achieves the optimal rate of convergence when m = 1, 2, 3, but fails to achieve this optimal rate when m is greater than three, due to the estimator bias.
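The simplest shape constraint in this family, m = 1 (the function itself is increasing), can be illustrated with monotone least-squares regression via the pool-adjacent-violators algorithm. This is only a sketch of the shape constraint, not the B-spline estimator analyzed in the talk; the data-generating function and noise level below are assumptions for the demo.

```python
import numpy as np

def pava(y):
    """Pool Adjacent Violators: least-squares fit of the data y
    subject to the fitted values being nondecreasing."""
    vals, wts = [], []
    for v in np.asarray(y, dtype=float):
        vals.append(v)
        wts.append(1.0)
        # Merge adjacent blocks while the monotone constraint is violated.
        while len(vals) > 1 and vals[-2] > vals[-1]:
            v2, w2 = vals.pop(), wts.pop()
            v1, w1 = vals.pop(), wts.pop()
            vals.append((w1 * v1 + w2 * v2) / (w1 + w2))
            wts.append(w1 + w2)
    # Expand pooled blocks back to one fitted value per observation.
    out = []
    for v, w in zip(vals, wts):
        out.extend([v] * int(round(w)))
    return np.array(out)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = x**2 + 0.1 * rng.standard_normal(200)  # increasing truth plus noise
fit = pava(y)                              # nondecreasing fitted values
```

Each fitted value is an average of the data over a pooled block, so the fit stays within the range of the data while respecting the monotone constraint.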