Numerix, PRMIA and Microsoft recently co-hosted an industry thought leadership forum discussion at New York City’s Harvard Club. In front of a standing room-only audience, panelists explored the challenges surrounding CVA implementation from the regulatory, quantitative and technology perspectives.
The evening kicked off with a thought-provoking presentation from Aletta Ely, Vice President, JP Morgan Chase, Investment Bank Credit Portfolio Management Group. Ms. Ely addressed how regulations such as Basel III and Dodd-Frank are driving CVA best practices toward a single, enterprise-wide source of truth. Citing the difficulties of aggregating across CVA desks and asset classes, serving global customers across time zones, and wrangling ubiquitous data issues, Ms. Ely stressed the need for organizations to weigh the regulatory and business demands for cross-asset, real-time CVA against its practical implementation challenges.
The second panelist, Denny Yu, Product Manager of Risk at Numerix, delved deeper into CVA’s quantitative and computational challenges. He reiterated concerns surrounding the massive amounts of complex calculations associated with accurate CVA computation.
Denny Yu, Product Manager, Risk for Numerix speaks at NYC’s Harvard Club to industry leaders on CVA
“CVA changes the most fundamental assumptions in OTC derivatives pricing,” he explained, adding that it requires both a portfolio-level and a counterparty-level view of trades.
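To make the computational burden concrete, a common textbook formulation of unilateral CVA discounts expected exposure over time buckets and weights it by the counterparty's marginal default probability. The sketch below is purely illustrative and assumes bucketed inputs that, in practice, would each come from large-scale Monte Carlo simulation across the whole netted portfolio; it is not a description of any panelist's or vendor's implementation.

```python
def cva(expected_exposures, discount_factors, default_probs, recovery=0.4):
    """Approximate unilateral CVA from pre-computed, bucketed inputs.

    expected_exposures: EE(t_i) per time bucket (netted, portfolio level)
    discount_factors:   DF(t_i) per bucket
    default_probs:      cumulative counterparty PD(t_i) per bucket
    recovery:           assumed recovery rate R

    CVA ~= (1 - R) * sum_i DF(t_i) * EE(t_i) * [PD(t_i) - PD(t_{i-1})]
    """
    cva_value = 0.0
    prev_pd = 0.0
    for ee, df, pd in zip(expected_exposures, discount_factors, default_probs):
        cva_value += (1.0 - recovery) * ee * df * (pd - prev_pd)
        prev_pd = pd
    return cva_value

# Toy example: three annual buckets with hypothetical inputs.
charge = cva(expected_exposures=[100.0, 80.0, 60.0],
             discount_factors=[0.98, 0.95, 0.92],
             default_probs=[0.01, 0.025, 0.04])
```

Even in this simplified form, each `EE(t_i)` hides thousands of simulated market scenarios revalued at every time step, which is where the "massive amounts of complex calculations" Mr. Yu described come from.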
Mr. Yu also outlined the payout problems that result from static CDS hedging, highlighting dynamic hedging as one partial solution. However, a combination of vanilla CDS, foreign-denominated CDS, and contingent CDS is coming more into vogue and could be a better way to hedge exposure, he explained, adding that we’ll have to take a “wait-and-see” approach.
“When it comes to CVA, we have different versions of the truth, depending upon what part of the organization we are in,” explained the third panelist, David Cox, Director, U.S. Banking and Capital Markets at Microsoft. He emphasized the need to use data holistically, at the enterprise level.
“There is a massive amount of new data coming into the market beyond levels previously imagined,” observed Mr. Cox. “The big problem is that 90% of it has been created in the past two years alone.”
“We are facing both a big data and a compute problem,” he added. Traditional models no longer work, and the compute capacity of many institutions is simply inadequate.
“VaR can be a major computing challenge, even for vanilla instruments,” Mr. Cox explained. He cited one UK insurance company that estimated it would take ten years to run the calculations needed to address even the most basic Solvency II requirements.
So, what can practitioners do to tackle these challenges and manage them effectively?
Mr. Cox cited the need for a fresh approach. “Bursting pricing calculations to the cloud may be one effective solution to manage the huge CVA compute challenge we are facing today,” he explained. “The cloud offers virtually unlimited computing power at literally the touch of a button, and it can bridge the gap between the business and IT without the cost of buying a new server.”
For more information, contact [email protected].