[1] Isbell, C. L., et al. “Cobot in LambdaMOO: A social statistics agent.” AAAI/IAAI, 2000, pp. 36-41.
[2] Wang, H., et al. “A Dataset for Research on Short-Text Conversations.” EMNLP, 2013, pp. 935-945.
[3] Vinyals, O., and Le, Q. “A neural conversational model.” arXiv preprint arXiv:1506.05869, 2015.
[4] Shang, L., Lu, Z., and Li, H. “Neural responding machine for short-text conversation.” arXiv preprint arXiv:1503.02364, 2015.
[5] Serban, I. V., et al. “Building End-To-End Dialogue Systems Using Generative Hierarchical Neural Network Models.” AAAI, 2016, pp. 3776-3784.
[6] Li, J., et al. “A diversity-promoting objective function for neural conversation models.” arXiv preprint arXiv:1510.03055, 2015.
[7] Jieba. [online] Available at: https://github.com/fxsjy/jieba [Accessed 10 Jan. 2017].
[8] Understanding LSTM Networks. [online] Available at: http://colah.github.io/posts/2015-08-Understanding-LSTMs [Accessed 12 Mar. 2017].
[9] Williams, R. J., and Zipser, D. “Gradient-based learning algorithms for recurrent networks and their computational complexity.” Backpropagation: Theory, Architectures, and Applications, 1995, pp. 433-486.
[10] Werbos, P. J. “Generalization of backpropagation with application to a recurrent gas market model.” Neural Networks, 1988, 1(4): 339-356.
[11] Robinson, A. J., and Fallside, F. “The utility driven dynamic error propagation network.” Cambridge University Engineering Department, 1987.
[12] Hochreiter, S. “Untersuchungen zu dynamischen neuronalen Netzen.” Diploma thesis, Institut für Informatik, Lehrstuhl Prof. Brauer, Technische Universität München, 1991.
[13] Hochreiter, S., and Schmidhuber, J. “Long short-term memory.” Neural Computation, 1997, 9(8): 1735-1780.
[14] Bengio, Y., Simard, P., and Frasconi, P. “Learning long-term dependencies with gradient descent is difficult.” IEEE Transactions on Neural Networks, 1994, 5(2): 157-166.
[15] Cho, K., et al. “Learning phrase representations using RNN encoder-decoder for statistical machine translation.” arXiv preprint arXiv:1406.1078, 2014.
[16] Jozefowicz, R., Zaremba, W., and Sutskever, I. “An empirical exploration of recurrent network architectures.” Proceedings of the 32nd International Conference on Machine Learning (ICML-15), 2015, pp. 2342-2350.
[17] Sutskever, I., Vinyals, O., and Le, Q. V. “Sequence to sequence learning with neural networks.” Advances in Neural Information Processing Systems, 2014, pp. 3104-3112.
[18] Bahdanau, D., Cho, K., and Bengio, Y. “Neural machine translation by jointly learning to align and translate.” arXiv preprint arXiv:1409.0473, 2014.
[19] Wu, S. H., et al. “CYUT Short Text Conversation System for NTCIR-12 STC.” NTCIR, 2016.
[20] Sequence to Sequence Models. [online] Available at: https://www.tensorflow.org/tutorials/seq2seq [Accessed 22 Feb. 2017].
[21] Apache Lucene. [online] Available at: http://lucene.apache.org [Accessed 17 Apr. 2017].