[Overview] The deep learning framework TensorFlow 2.0 (preview) can now be installed with pip. This article shows how to install the TensorFlow 2.0 preview and introduces tf2_course, a GitHub project containing a set of TensorFlow 2 exercises and solutions presented as Jupyter notebooks.
Installing the TensorFlow 2.0 Preview
We successfully installed the TensorFlow 2.0 preview in a Python 3.6 environment on Ubuntu (so far we have not managed to install it on Windows or under Python 3.5). First, use Miniconda or Anaconda to create a Python 3.6 environment named python36:
conda create -n python36 python=3.6
source activate python36
Then install the TensorFlow 2.0 preview directly with pip:
pip install tf-nightly-gpu-2.0-preview
Note that the GPU build links against CUDA 10.0. If CUDA 10.0 is not installed, importing TensorFlow fails with an error such as:
ImportError: libcublas.so.10.0: cannot open shared object file: No such file or directory
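If the installation succeeded, a quick sanity check is to import TensorFlow and confirm that the 2.0 preview is active and that eager execution is on by default (a minimal sketch, not part of the original instructions):

import tensorflow as tf

# The nightly preview reports a development version string beginning with "2.0".
print(tf.__version__)

# Eager execution is enabled by default in TensorFlow 2.0,
# so operations return concrete values immediately.
print(tf.executing_eagerly())   # True
print(tf.add(1, 2).numpy())     # 3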
To get the tf2_course exercises, clone the repository from GitHub:
$ cd $HOME # or any other development directory you prefer
$ git clone https://github.com/ageron/tf2_course.git
$ cd tf2_course
The course currently contains three sets of notebooks, each with exercises and solutions (a small Keras example in the 2.0 style is sketched after this list):
1. Neural Nets with Keras
2. Low-Level TensorFlow API
3. Loading and Preprocessing Data
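To give a flavor of what the first notebook works with, here is a minimal tf.keras model in the 2.0 style; this sketch is illustrative and is not taken from the course notebooks (the dataset and layer sizes are placeholders):

import tensorflow as tf
from tensorflow import keras

# A small fully connected classifier for 28x28 grayscale images (e.g. Fashion MNIST).
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(128, activation="relu"),
    keras.layers.Dense(10, activation="softmax"),
])

# Standard compile step: optimizer, loss for integer labels, and an accuracy metric.
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training would then be, for example:
# model.fit(train_images, train_labels, epochs=5)
# where train_images/train_labels are NumPy arrays or a tf.data.Dataset.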