Today I'll show you how to run Jupyter Notebook on a server.

Running deep learning: it sounds fancy, but it's also a pretty goofy thing.

So buy a server and show off a little.

Virtual environment

Win10 is not recommended.

mkvirtualenv -p /usr/bin/python3.6 deeplearn
workon deeplearn
pip install tensorflow
pip install jupyter

Then

vim ~/.jupyter/jupyter_notebook_config.py

and change the following:

c.NotebookApp.ip = '0.0.0.0'
c.NotebookApp.open_browser = False
c.NotebookApp.port = 8888
c.NotebookApp.password = ''
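If that config file doesn't exist yet, Jupyter can generate it for you. A minimal sketch of that step (and, optionally, setting a real login password instead of leaving it empty):

```shell
# Create ~/.jupyter/jupyter_notebook_config.py if it doesn't exist yet
jupyter notebook --generate-config

# Optional: set a hashed login password interactively
# (safer than an empty password on a server listening on 0.0.0.0)
jupyter notebook password
```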

Then turn off the firewall so the connection can get through:

root@VM-0-5-ubuntu:~# firewall-cmd --state
running
root@VM-0-5-ubuntu:~# systemctl stop firewalld.service
root@VM-0-5-ubuntu:~# systemctl disable firewalld.service
Synchronizing state of firewalld.service with SysV service script with /lib/systemd/systemd-sysv-install.
Executing: /lib/systemd/systemd-sysv-install disable firewalld
Removed /etc/systemd/system/dbus-org.fedoraproject.FirewallD1.service.

Launch jupyter notebook.

Open a browser, and it's up and running.

Alternatively, instead of exposing the server, you can work from your local machine and set up a tunnel to the VM.
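One way to build that tunnel is SSH local port forwarding. A sketch, assuming the server is reachable over SSH (the username and host below are placeholders):

```shell
# Forward local port 8888 to port 8888 on the server;
# -N means "no remote command, just forward ports".
# Then browse to http://localhost:8888 on your own machine.
ssh -N -L 8888:localhost:8888 ubuntu@your-server-ip
```

With a tunnel, Jupyter can keep listening on localhost only, so you don't need to open the firewall at all.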

This configuration is simple.

It opens fine locally.

Done. Now, naturally, it's time to review TensorFlow. TF has moved on to 2.0, which is not backward compatible with 1.x, but TF 2.0 is also much simpler.

Official site: http://www.tensorfly.cn/

TensorFlow is an open-source powerhouse for AI.

Some people say TF is hard?

Hard, my foot. Anything in Python is easy.

Look at the handwritten-digit MNIST example below: it's really just the Keras API.

# TensorFlow
import tensorflow as tf
print(tf.__version__)
2.0.0
# Load the handwritten-digit dataset
mnist = tf.keras.datasets.mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()
from matplotlib import pyplot as plt
%matplotlib inline
plt.imshow(x_train[0])
# Scale pixel values to [0, 1]
x_train, x_test = x_train / 255.0, x_test / 255.0
# Build the model with TF's Keras API
model = tf.keras.models.Sequential([
    # flatten each (28, 28) image into a 784-long vector
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    # 128 neurons
    tf.keras.layers.Dense(128, activation='relu'),
    # Dropout to reduce overfitting
    tf.keras.layers.Dropout(0.2),
    # classify into 10 classes
    tf.keras.layers.Dense(10, activation='softmax')
])
# multi-class loss: sparse_categorical_crossentropy
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
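To demystify that loss choice, here is a stdlib-only sketch of what softmax plus `sparse_categorical_crossentropy` compute. "Sparse" just means the label is a plain integer class index, not a one-hot vector; the loss is the negative log of the probability the model assigns to the true class. The score vector below is made up for illustration.

```python
import math

def softmax(logits):
    # subtract the max for numerical stability before exponentiating
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sparse_categorical_crossentropy(y_true, probs):
    # y_true is an integer class index ("sparse": no one-hot needed)
    return -math.log(probs[y_true])

# Toy 3-class example; suppose the true class is index 2
probs = softmax([0.5, 1.0, 3.0])
loss = sparse_categorical_crossentropy(2, probs)
print(probs)  # probabilities, summing to 1
print(loss)   # small, since the model favors the true class
```

A confident correct prediction (probability near 1 on the true class) gives a loss near 0; a confident wrong one blows the loss up, which is exactly the gradient signal training needs.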
# Train for 5 epochs
model.fit(x_train, y_train, epochs=5)
model.evaluate(x_test, y_test, verbose=2)
# [0.07285336476690137, 0.9783]
Train on 60000 samples
Epoch 1/5
60000/60000 [==============================] - 8s 132us/sample - loss: 0.3024 - accuracy: 0.9125
Epoch 2/5
60000/60000 [==============================] - 6s 92us/sample - loss: 0.1457 - accuracy: 0.9564
Epoch 3/5
60000/60000 [==============================] - 5s 88us/sample - loss: 0.1095 - accuracy: 0.9672
Epoch 4/5
60000/60000 [==============================] - 5s 88us/sample - loss: 0.0900 - accuracy: 0.9730
Epoch 5/5
60000/60000 [==============================] - 6s 92us/sample - loss: 0.0745 - accuracy: 0.9769
10000/1 - 1s - loss: 0.0372 - accuracy: 0.9783
[0.07285336476690137, 0.9783]
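That final 0.9783 is the accuracy metric: the fraction of test images whose highest-probability class matches the label. A stdlib-only sketch of that computation, using made-up toy predictions rather than real model output:

```python
def accuracy(y_true, y_pred_probs):
    """Fraction of samples where the argmax class equals the label."""
    correct = 0
    for label, probs in zip(y_true, y_pred_probs):
        # predicted class = index of the highest probability
        predicted = max(range(len(probs)), key=probs.__getitem__)
        if predicted == label:
            correct += 1
    return correct / len(y_true)

# Toy example: 4 samples, 3 classes; 3 of the 4 argmaxes match
y_true = [0, 1, 2, 2]
y_pred_probs = [
    [0.9, 0.05, 0.05],  # argmax 0, correct
    [0.1, 0.8, 0.1],    # argmax 1, correct
    [0.2, 0.5, 0.3],    # argmax 1, wrong
    [0.1, 0.2, 0.7],    # argmax 2, correct
]
print(accuracy(y_true, y_pred_probs))  # 0.75
```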
