Session Chair: Xinxuan Li
Speaker 1: Hyekyung Park
- Robust Value-at-Risk (VaR) portfolio selection problem under the joint ellipsoidal uncertainty set in the presence of transaction costs
- The robust portfolio selection problem considers the worst case of return under uncertainty sets for parameters such as the mean return and the covariance of returns. Goldfarb and Iyengar defined asset returns through a factor model and proposed the ‘Separable’ uncertainty sets for the mean return and covariance of factor returns. However, these sets are too conservative and lead to non-diversified portfolios. To overcome these drawbacks, Lu defined the ‘Joint’ ellipsoidal uncertainty set for the mean return and covariance of factor returns.
In this research we derive a robust portfolio under the ‘Joint’ ellipsoidal uncertainty set. The problem is to maximize the expected return of a portfolio while restricting the probability that the loss exceeds an investor’s acceptable level at a specified degree of confidence; this is called the robust Value-at-Risk (VaR) constraint problem. The constraint establishes an upper bound ε on the probability of losing a given percentage δ of the investment. Under the uncertainty set the constraint is non-convex, so we use two reasonable approximations, which can be expressed as semidefinite and second-order cone constraints, so that the approximated problem can be solved efficiently. Computational results on real market data show why the approximations are reasonable, and these results are compared with the problem under the ‘Separable’ uncertainty sets.
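As a concrete illustration of the ε–δ constraint described above, the following sketch (with purely illustrative asset numbers, not data from the talk) checks empirically whether a fixed portfolio satisfies a VaR constraint of the form P(loss ≥ δ) ≤ ε under an assumed normal return model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical three-asset example (all numbers are illustrative).
mu = np.array([0.08, 0.05, 0.03])      # assumed mean returns
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.02, 0.00],
                [0.00, 0.00, 0.01]])   # assumed covariance of returns
w = np.array([0.3, 0.4, 0.3])          # candidate portfolio weights

delta = 0.10   # acceptable loss level (10% of the investment)
eps = 0.10     # upper bound on the probability of losing more than delta

# Empirical check of the VaR constraint P(r_p <= -delta) <= eps
# by sampling portfolio returns from the assumed distribution.
r_p = rng.multivariate_normal(mu, cov, size=100_000) @ w
p_loss = np.mean(r_p <= -delta)
satisfies_var = p_loss <= eps
```

In the robust problem this probability must be bounded for every (mean, covariance) pair in the uncertainty set, which is what makes the constraint non-convex and motivates the tractable approximations.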
Additionally, we extend the robust VaR constraint problem under the ‘Joint’ uncertainty set to the setting with transaction costs, the expenses incurred when buying or selling stocks. The formulation is adapted from the multi-period portfolio management problem and uses the same notation. The problem is to maximize the transaction-cost-adjusted return subject to the VaR constraint under the ellipsoidal uncertainty set. A simulation on real market data examines the impact of including transaction costs in the model.
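One common way to write such a cost-adjusted objective (a sketch assuming proportional costs c on trades away from a current holding w⁰; the talk’s exact formulation may differ) is:

```latex
\max_{w} \;\; \mu^{\top} w \;-\; c^{\top}\,\lvert w - w^{0} \rvert
\qquad \text{s.t.} \qquad
\sup_{(\mu,\,\Sigma)\,\in\,\mathcal{U}} \Pr\!\bigl( r_{p}(w) \le -\delta \bigr) \;\le\; \varepsilon ,
```

where $\mathcal{U}$ is the joint ellipsoidal uncertainty set and $r_{p}(w)$ is the portfolio return.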
Speaker 2: Arayana Arsham
- Monte Carlo Parallel Processing: Reliability Analysis
- Monte Carlo (MC) simulation is a widely used tool for the analysis of complex reliability systems. The performance of such systems is expressed through reliability metrics such as the mean time to failure. Simulation methods for assessing system reliability are often hindered by their considerable memory and time requirements. Implementing MC reliability algorithms with parallel processing methods greatly enhances their efficiency and effectiveness. This work presents MC and parallel processing methods applied to realistic system models, along with results from implementing such algorithms in R.
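To make the idea concrete, here is a minimal sketch (in Python rather than the R used in the talk, with assumed component parameters) of an embarrassingly parallel MC estimate of the mean time to failure of a series system; each independent chunk could be dispatched to its own worker process or core:

```python
import numpy as np

# Illustrative system (parameters assumed, not from the talk): three
# components in series with exponential lifetimes. A series system
# fails as soon as any component fails, so the system lifetime is the
# minimum of the component lifetimes.
rates = np.array([0.5, 1.0, 2.0])   # component failure rates per unit time
n_total = 400_000                   # total MC replications
n_chunks = 8                        # independent chunks; replications are
                                    # independent, so each chunk could run
                                    # on its own worker in parallel

def simulate_chunk(n, seed):
    """Simulate n system lifetimes and return their sum."""
    r = np.random.default_rng(seed)
    lifetimes = r.exponential(1.0 / rates, size=(n, len(rates)))
    return lifetimes.min(axis=1).sum()

chunk_sums = [simulate_chunk(n_total // n_chunks, seed)
              for seed in range(n_chunks)]
mttf_estimate = sum(chunk_sums) / n_total

# For independent exponential components in series the exact MTTF is
# 1 / sum(rates), so the estimate can be checked against 1 / 3.5.
```

The loop over chunks stands in for the parallel dispatch; replacing it with, e.g., a process pool changes nothing in the estimator because the chunks share no state.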