Snoek et al. 2012 paper (Practical Bayesian Optimization of Machine Learning Algorithms)
code here: http://www.cs.toronto.edu/~jasper/software.html
TPE: https://github.com/jaberg/hyperopt/wiki (minimal usage sketch below)
DeepMind's paper on Bayesian optimization
overview of Bayesian optimization by the author of Spearmint
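For quick reference, a minimal example of running the TPE optimizer from the hyperopt wiki linked above. The objective function and search space are made up for illustration; only the fmin/tpe/hp API is hyperopt's.

```python
# Minimal hyperopt/TPE example; the objective and search space are toy choices.
from hyperopt import fmin, tpe, hp, STATUS_OK

def objective(params):
    # Pretend the "loss" depends on two hyperparameters.
    loss = (params["x"] - 3.0) ** 2 + params["log_lr"] ** 2
    return {"loss": loss, "status": STATUS_OK}

space = {
    "x": hp.uniform("x", -5, 5),          # continuous hyperparameter
    "log_lr": hp.normal("log_lr", 0, 1),  # e.g. log learning rate
}

best = fmin(objective, space, algo=tpe.suggest, max_evals=100)
print(best)  # best hyperparameter settings found by TPE
```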
This analysis suggests that the Earth System may be approaching a planetary threshold that could lock in a continuing rapid pathway toward much hotter conditions—Hothouse Earth. This pathway would be propelled by strong, intrinsic, biogeophysical feedbacks that are difficult to influence by human actions, and it could not be reversed, steered, or substantially slowed.
Where such a threshold might be is uncertain, but it could be only decades ahead at a temperature rise of ∼2.0 °C above preindustrial, and thus, it could be within the range of the Paris Accord temperature targets.
The impacts of a Hothouse Earth pathway on human societies would likely be massive, sometimes abrupt, and undoubtedly disruptive.
2 °C warming would translate to 1,119 (748–1,392) or 1,327 (1,123–1,516) cities committed under the baseline or triggered assumptions, respectively, and would affect land that is home to 19.0 (11.6–25.0) or 23.0 (16.8–28.1) million people today, respectively. Warming of 4 °C would increase central estimates to more than 1,745 cities and 30 million people under either assumption.
This article discusses the organization of the repurchase agreement (repo) market in Canada. We define the repo contract, the market infrastructures that support repo trading and the composition of the market participants. We also describe repo trading practices in Canada, risks in the repo market and repo regulation. A repo is a financial contract that resembles a collateralized loan. It is used to support the funding needs of financial institutions and to procure specific securities on a temporary basis. The Canadian repo market is primarily composed of large banks and large investment institutions such as pension funds. A unique feature of the Canadian market is that Canadian investment institutions are net borrowers of cash via repo. Repo can transmit risks in the financial system because it can create levered interconnections among participants. Risks in the Canadian repo market are smaller than in other jurisdictions.
[1702.03275] Batch Renormalization: Towards Reducing Minibatch Dependence in Batch-Normalized Models
Batch Normalization is quite effective at accelerating and improving the training of deep models. However, its effectiveness diminishes when the training minibatches are small, or do not consist of independent samples. We hypothesize that this is due to the dependence of model layer inputs on all the examples in the minibatch, and different activations being produced between training and inference. We propose Batch Renormalization, a simple and effective extension to ensure that the training and inference models generate the same outputs that depend on individual examples rather than the entire minibatch. Models trained with Batch Renormalization perform substantially better than batchnorm when training with small or non-i.i.d. minibatches. At the same time, Batch Renormalization retains the benefits of batchnorm such as insensitivity to initialization and training efficiency.
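A rough sketch of the correction the paper describes: during training, activations are still normalized with minibatch statistics, but corrected by factors r and d computed from the moving statistics (and treated as constants), so that training and inference produce matching per-example outputs. Variable names below are mine; r_max, d_max, and the momentum are hyperparameters.

```python
import numpy as np

def batch_renorm_forward(x, gamma, beta, mov_mean, mov_std,
                         r_max=3.0, d_max=5.0, momentum=0.99, eps=1e-5,
                         training=True):
    """Forward pass of Batch Renormalization (illustrative sketch, not the
    reference implementation). x has shape (batch, features)."""
    if training:
        mu = x.mean(axis=0)
        sigma = x.std(axis=0) + eps
        # Correction factors; the paper treats these as constants (no gradient).
        r = np.clip(sigma / mov_std, 1.0 / r_max, r_max)
        d = np.clip((mu - mov_mean) / mov_std, -d_max, d_max)
        x_hat = (x - mu) / sigma * r + d
        # Update moving statistics for use at inference time.
        mov_mean = momentum * mov_mean + (1 - momentum) * mu
        mov_std = momentum * mov_std + (1 - momentum) * sigma
    else:
        x_hat = (x - mov_mean) / mov_std
    return gamma * x_hat + beta, mov_mean, mov_std
```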
This article proposes a hierarchical clustering-based asset allocation method, which uses graph theory and machine learning techniques. Hierarchical clustering refers to the formation of a recursive clustering, suggested by the data, not defined a priori. Several hierarchical clustering methods are presented and tested. Once the assets are hierarchically clustered, the authors compute a simple and efficient capital allocation within and across clusters of assets, so that many correlated assets receive the same total allocation as a single uncorrelated one. The out-of-sample performances of hierarchical clustering-based portfolios and more traditional risk-based portfolios are evaluated across three disparate datasets, which differ in terms of the number of assets and the assets’ composition. To avoid data snooping, the authors assess the comparison of profit measures using the bootstrap-based model confidence set procedure. Their empirical results indicate that hierarchical clustering-based portfolios are robust, truly diversified, and achieve statistically better risk-adjusted performance than commonly used portfolio optimization techniques.
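A minimal sketch of the general recipe (the specific linkage and within-cluster rule below are illustrative assumptions, not necessarily the authors' exact choices): cluster the assets on a correlation-based distance, split capital equally across clusters, then allocate by inverse variance within each cluster.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def hierarchical_cluster_weights(returns, n_clusters=5):
    """Hierarchically cluster assets on a correlation distance, then allocate
    equally across clusters and by inverse variance within clusters."""
    corr = np.corrcoef(returns, rowvar=False)
    dist = np.sqrt(0.5 * (1.0 - corr))                    # correlation-based distance
    Z = linkage(squareform(dist, checks=False), method="average")
    labels = fcluster(Z, t=n_clusters, criterion="maxclust")

    var = returns.var(axis=0)
    weights = np.zeros(returns.shape[1])
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        w_in = (1.0 / var[idx]) / (1.0 / var[idx]).sum()  # inverse variance within cluster
        weights[idx] = w_in / len(np.unique(labels))      # equal budget per cluster
    return weights
```

This is why correlated assets end up sharing one cluster's budget: a group of near-duplicates receives roughly the same total allocation as a single uncorrelated asset.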
In this article, the author revisits his seminal paper on tactical asset allocation published over 10 years ago in The Journal of Wealth Management. How well has this market strategy—a simple quantitative method that improves the risk-adjusted returns across various asset classes—held up since its 2007 publication? Overall, the author finds that the model has performed well in real time, achieving equity-like returns with bond-like volatility and drawdowns. The author also examines the effects of departures from the original system, including adding more asset classes, introducing various portfolio allocations, and implementing alternative cash management strategies.
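The original system is a monthly moving-average timing rule; a minimal sketch, assuming the 10-month simple moving average filter from the 2007 paper (hold the asset when its month-end price is above the SMA, otherwise hold cash):

```python
import pandas as pd

def sma_timing(monthly_prices: pd.Series, cash_return: pd.Series, window: int = 10):
    """Hold the asset when its month-end price closes above the trailing SMA,
    otherwise hold cash. Sketch of a Faber-style timing rule (assumed, not
    copied from the article)."""
    sma = monthly_prices.rolling(window).mean()
    signal = (monthly_prices > sma).shift(1, fill_value=False)  # decided at prior month-end
    asset_ret = monthly_prices.pct_change()
    strat_ret = asset_ret.where(signal, cash_return)
    return strat_ret.dropna()
```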
Global financial markets have been experiencing low-risk anomalies for decades. In a low-risk anomaly, low-risk stocks offer better returns than high-risk stocks, violating the fundamental tenets of many financial theories. We developed an optimal portfolio strategy that exploits low-risk anomalies in the Black–Litterman framework. Our view is that low-risk assets will outperform high-risk assets. Forecasting volatility is the most important factor in constructing a view portfolio and in determining portfolio performance. To increase the predictive power regarding volatility, the best-performing prediction model should be selected.
We compared the predictive power of three state-of-the-art machine-learning models (GPR, SVR, and ANN) against GARCH and historical volatility. SVR and ANN showed better predictive power than GARCH on all error metrics. ANN was chosen as the best model because it showed higher predictive stability than SVR. We predicted the volatility of each asset with the chosen ANN model and used these forecasts to construct a Black–Litterman portfolio that exploits the low-risk anomaly. We compared the performance of the low-risk Black–Litterman portfolio with the market portfolio and with the CAPM-based market equilibrium portfolio that excludes the low-risk view in the Black–Litterman framework.
Without the low-risk view, the CAPM-based equilibrium portfolio showed a lower Sharpe ratio than the market portfolio and a negative alpha. Reflecting the low-risk view greatly improved both the Sharpe ratio and the alpha, so that the resulting portfolio dominated the market portfolio. In addition, the estimation error of the expected returns and covariance matrix under the low-risk view decreased as τ decreased, contributing to the improvement of the portfolio's performance.
Since low-risk anomalies are global phenomena, the market for volatility strategies is expected to be enormous. We can also combine low-risk anomalies in each market to form an optimal portfolio.
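For reference, the standard Black–Litterman posterior used to blend the equilibrium returns with a view; the single "low-risk beats high-risk" view and all numbers below are made up for illustration.

```python
import numpy as np

def black_litterman_posterior(Sigma, pi, P, Q, Omega, tau=0.05):
    """Standard Black-Litterman posterior mean of expected returns.
    Sigma: (n, n) covariance; pi: (n,) equilibrium returns;
    P: (k, n) view pick matrix; Q: (k,) view returns; Omega: (k, k) view uncertainty."""
    tS_inv = np.linalg.inv(tau * Sigma)
    O_inv = np.linalg.inv(Omega)
    A = tS_inv + P.T @ O_inv @ P
    b = tS_inv @ pi + P.T @ O_inv @ Q
    return np.linalg.solve(A, b)

# Illustrative low-risk view: asset 0 (low volatility) outperforms asset 2
# (high volatility) by 2% a year; covariances and priors are toy numbers.
Sigma = np.diag([0.02, 0.05, 0.10])
pi = np.array([0.03, 0.05, 0.08])
P = np.array([[1.0, 0.0, -1.0]])
Q = np.array([0.02])
Omega = np.array([[0.01]])
mu_bl = black_litterman_posterior(Sigma, pi, P, Q, Omega)
```

Shrinking τ tightens the prior around the equilibrium returns, which is consistent with the abstract's observation that estimation error fell as τ decreased.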
Professor Hand argued that there are reasons why pattern recognition may not do better than traditional linear regression algorithms—namely, nonstationarity, low signal-to-noise ratios, and overfitting. Up to this point, we have considered many interesting ideas, some of which are backed by economic intuition, but we have yet to see a significant amount of empirical evidence. Future research should focus on actionable ideas regarding machine learning and big data across the entire spectrum of the investment process—that is, alpha, beta, risk management, and execution and trading.
We use data from 1999 to 2015 for 45 different assets. We compare the risk-return characteristics of all network-based asset allocation strategies with their respective benchmark models in an out-of-sample framework. Within our sample, we show that using information on the interconnectedness of asset returns, given the topological structure of a network, improves the risk-return characteristics of standard benchmark portfolios. The constructed network captures complex relationships between assets beyond those measured by pairwise correlations.
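The abstract does not spell out the exact network construction, so the following is only a sketch of one common variant: build a minimum spanning tree from a correlation distance and tilt weights toward peripheral (less central) assets.

```python
import numpy as np
import networkx as nx

def mst_peripheral_weights(returns):
    """Sketch of a network-based allocation: minimum spanning tree on a
    correlation distance, with weights inverse to degree centrality.
    (Illustrative choice, not necessarily the paper's construction.)"""
    corr = np.corrcoef(returns, rowvar=False)
    dist = np.sqrt(2.0 * (1.0 - corr))
    n = corr.shape[0]
    G = nx.Graph()
    for i in range(n):
        for j in range(i + 1, n):
            G.add_edge(i, j, weight=dist[i, j])
    mst = nx.minimum_spanning_tree(G)
    centrality = nx.degree_centrality(mst)
    inv = np.array([1.0 / (centrality[i] + 1e-12) for i in range(n)])
    return inv / inv.sum()
```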
We propose the so-called ρ-dependent strategy and test its performance against the extremely simple yet effective 1/N naïve rule and two Markowitz-related policies. Our out-of-sample results show that the ρ-dependent strategy tends to deliver significantly higher portfolio Sharpe ratios and lower portfolio variance than these well-known benchmarks. Additionally, this enhanced performance is not explained by large exposures to traditional risk factors, as indicated by positive and statistically significant Carhart alphas. More importantly, our results are robust across several portfolio configurations, time periods and markets, even after accounting for transaction costs.
The MVO algorithm treats all variables as interrelated, assuming a complete cluster. In other words, traditional asset allocation does not recognize the complexity inherent in the data. This work presents novel, robust and flexible methods, with visual interpretations, for constructing risk-adjusted portfolios. Clustering methods showed a better trade-off between return and risk than the MVO algorithm. The empirical results indicate that hierarchical algorithms perform better when building diversified portfolios, as measured by the Omega ratio. One of the most important results is the stable behavior of clustering-based portfolios, which addresses a key issue in financial markets: volatility.
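Since the portfolios here are ranked by the Omega ratio, a quick reminder of how it is computed (the threshold is typically zero or a minimum acceptable return):

```python
import numpy as np

def omega_ratio(returns, threshold=0.0):
    """Omega ratio: expected gains above the threshold divided by expected
    losses below it."""
    excess = np.asarray(returns) - threshold
    gains = excess[excess > 0].sum()
    losses = -excess[excess < 0].sum()
    return np.inf if losses == 0 else gains / losses
```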