Session Chair: Ahmad Mousavi
Discussant: Dr. Jinglai Shen
Speaker 1: Eswar Kammara
- Proximal Algorithms
- This talk is about a class of algorithms, called proximal algorithms, for solving convex optimization problems. Much like Newton's method is a standard tool for solving unconstrained smooth minimization problems of modest size, proximal algorithms can be viewed as an analogous tool for nonsmooth, constrained, large-scale, or distributed versions of these problems. They are very generally applicable, but they turn out to be especially well-suited to problems of recent and widespread interest involving large or high-dimensional datasets.
In this talk, we will give an introduction to proximal operators, discuss the proximal gradient method, the accelerated proximal gradient method, and ADMM, and compare their performance on the LASSO problem.
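To make the connection between proximal operators and the LASSO concrete, here is a minimal sketch of the (unaccelerated) proximal gradient method, often called ISTA. It is not the speaker's implementation; the step size, iteration count, and synthetic data are illustrative choices. The key point is that the proximal operator of the scaled l1-norm is elementwise soft-thresholding.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: elementwise soft-thresholding.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, n_iter=500):
    # Proximal gradient method for the LASSO:
    #   minimize (1/2) * ||A x - b||_2^2 + lam * ||x||_1
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, L = Lipschitz constant of the gradient
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)             # gradient of the smooth least-squares term
        x = soft_threshold(x - step * grad, step * lam)  # prox step on the l1 term
    return x

# Small synthetic example: a sparse signal recovered from noisy measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = ista(A, b, lam=0.1)
```

The accelerated variant (FISTA) adds a momentum extrapolation between iterations, and ADMM instead splits the problem and alternates between a least-squares update, a soft-thresholding update, and a dual update.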
Speaker 2: Nadeesri Wijekoon
- Conditionally Specified Joint Densities in the Exponential Family of Distributions
- In many real-life situations, it is easier and more intuitive to specify and visualize a joint distribution in terms of its conditional densities rather than through the joint directly. This structure arises in many applications, such as self-proxy data, finance, and synthetic data in survey sampling.
In this talk, we will explore the theoretical aspects of conditionally specified joint distributions. In particular, we will consider the most general class of joint distributions such that both sets of conditionals belong to the exponential family of densities.
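As an indication of the kind of result involved (this is the classical Arnold--Strauss characterization, stated here as background rather than as the speaker's exact formulation): if both conditional families $X \mid Y = y$ and $Y \mid X = x$ belong to exponential families, the joint density is forced into a bilinear exponential form.

```latex
% Joint density when both sets of conditionals are exponential families:
%   q^{(1)}(x) = (1, q_{11}(x), \dots, q_{1 l_1}(x))^\top,
%   q^{(2)}(y) = (1, q_{21}(y), \dots, q_{2 l_2}(y))^\top,
% and M is an (l_1+1) x (l_2+1) parameter matrix constrained so that f integrates to one.
f(x, y) \;=\; r_1(x)\, r_2(y)\,
  \exp\!\left\{ \mathbf{q}^{(1)}(x)^{\top} M \,\mathbf{q}^{(2)}(y) \right\}
```

For example, taking both conditionals to be normal recovers joint densities whose conditionals are Gaussian but which need not be bivariate normal themselves.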