[Recommended] RNN/LSTM Time Series Prediction

September 8, 2017 · 机器学习研究会
Abstract
 

Reposted from: 爱可可-爱生活

The Statsbot team has already published an article about using time series analysis for anomaly detection. Today, we’d like to discuss time series prediction with a long short-term memory model (LSTM). We asked a data scientist, Neelabh Pant, to tell you about his experience of forecasting exchange rates using recurrent neural networks.

As an Indian guy living in the US, I have a constant flow of money from home to me and vice versa. If the USD is stronger in the market, the Indian rupee (INR) goes down; hence, a person from India pays more rupees for a dollar. If the dollar is weaker, you spend fewer rupees to buy the same dollar.

If one can predict how much a dollar will cost tomorrow, then this can guide one’s decision making and can be very important in minimizing risks and maximizing returns. Looking at the strengths of a neural network, especially a recurrent neural network, I came up with the idea of predicting the exchange rate between the USD and the INR.

There are a lot of methods of forecasting exchange rates such as:

  • Purchasing Power Parity (PPP), which takes inflation into account and calculates the inflation differential.

  • Relative Economic Strength Approach, which considers the economic growth of countries to predict the direction of exchange rates.

  • An econometric model is another common technique used to forecast exchange rates, and it can be customized according to the factors or attributes the forecaster thinks are important. Features could include the interest rate differential between two countries, GDP growth rates, income growth rates, etc.

  • A time series model relies purely on the idea that past behavior and price patterns can be used to predict future price behavior.
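It is this last approach that the article builds on. As a minimal sketch of the idea (the rates below are made-up numbers purely for illustration, not real USD/INR data), a classical autoregressive baseline predicts tomorrow's rate as a linear function of today's:

```python
import numpy as np

# Toy "daily exchange rate" series (synthetic, for illustration only).
rates = np.array([64.1, 64.3, 64.2, 64.5, 64.7, 64.6, 64.9, 65.0])

# AR(1) model: r[t+1] ≈ a * r[t] + b, fitted by least squares
# on consecutive (today, tomorrow) pairs.
x, y = rates[:-1], rates[1:]
A = np.vstack([x, np.ones_like(x)]).T
a, b = np.linalg.lstsq(A, y, rcond=None)[0]

# One-step-ahead forecast from the most recent observation.
next_rate = a * rates[-1] + b
print(round(next_rate, 2))
```

An LSTM replaces this single linear coefficient with a learned nonlinear function of a whole window of past values, which is what the rest of the article works toward.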

In this article, we’ll tell you how to predict future exchange rate behavior using time series analysis and machine learning.

Sequence problems

Let us begin by talking about sequence problems. The simplest machine learning problem involving a sequence is a one to one problem.

One to One

In this case, we have one data input, or tensor, to the model, and the model generates a prediction for the given input. Linear regression, classification, and even image classification with convolutional networks fall into this category. We can extend this formulation to allow the model to make use of the past values of the input and the output.
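A minimal one-to-one example is ordinary linear regression: one input vector goes in, one prediction comes out, with no memory of previous inputs. The data below is synthetic and the weights are assumptions for illustration:

```python
import numpy as np

# One-to-one: one input tensor in, one prediction out.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))            # 100 samples, 3 features each
true_w = np.array([1.0, -2.0, 0.5])      # assumed "true" weights
y = X @ true_w + 0.01 * rng.normal(size=100)

# Closed-form least-squares fit recovers the weights.
w = np.linalg.lstsq(X, y, rcond=None)[0]

pred = X[0] @ w                          # one input -> one prediction
```

Each prediction here depends only on its own input row; the model has no way to use what came before.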

It is known as the one to many problem. The one to many problem starts like the one to one problem, where we have an input to the model and the model generates one output. However, the output of the model is now fed back to the model as a new input. The model can now generate a new output, and we can continue like this indefinitely. You can now see why these are known as recurrent neural networks.
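The feedback loop described above can be sketched with a tiny hand-rolled recurrent cell. The weights here are random and untrained, purely to show the mechanics of feeding an output back in as the next input:

```python
import numpy as np

# One-to-many: a single seed input, then the model's own output
# is fed back as the next input, step after step.
rng = np.random.default_rng(1)
W_h = rng.normal(scale=0.5, size=(4, 4))   # hidden-to-hidden weights
W_x = rng.normal(scale=0.5, size=(4, 1))   # input-to-hidden weights
W_o = rng.normal(scale=0.5, size=(1, 4))   # hidden-to-output weights

h = np.zeros((4, 1))                       # hidden state
x = np.array([[1.0]])                      # single seed input
outputs = []
for _ in range(5):                         # generate 5 steps from one input
    h = np.tanh(W_h @ h + W_x @ x)         # update hidden state
    x = W_o @ h                            # output becomes the next input
    outputs.append(x.item())
print(outputs)
```

A trained LSTM works the same way at generation time, except its weights are learned and its cell state lets it carry information across many more steps than this plain recurrence can.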


Link (requires a VPN to access from mainland China):

https://blog.statsbot.co/time-series-prediction-using-recurrent-neural-networks-lstms-807fa6ca7f


Original post:

https://m.weibo.cn/1402400261/4149583942871041

