[1] Frey, B. J. and Dueck, D. (2007), "Clustering by passing messages between data points," Science, pp. 972-976.
[2] Wang, C. D., Lai, J. H., Suen, C. Y., and Zhu, J. Y. (2013), "Multi-exemplar affinity propagation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 35, pp. 2223-2237.
[3] Zhang, X. L., Wang, W., Norvag, K., and Sebag, M. (2010), "K-AP: Generating Specified K Clusters by Efficient Affinity Propagation," 2010 IEEE International Conference on Data Mining.
[4] Wang, Y. and Chen, L. (2015), "K-MEAP: Multiple Exemplars Affinity Propagation With Specified K Clusters," IEEE Transactions on Neural Networks and Learning Systems, vol. 27, pp. 2670-2682.
[5] Goodfellow, I. J., Pouget-Abadie, J., Mirza, M., Xu, B., et al. (2014), "Generative Adversarial Networks," arXiv:1406.2661.
[6] Billard, L. (2006), "Symbolic data analysis: what is it?," Compstat 2006 - Proceedings in Computational Statistics, pp. 261-269.
[7] Mahalanobis, P. C. (1936), "On the generalised distance in statistics," Proceedings of the National Institute of Sciences of India.
[8] Bock, H. H. and Diday, E. (2000), "Analysis of Symbolic Data: Exploratory Methods for Extracting Statistical Information from Complex Data," Springer.
[9] Bertrand, P. and Goupil, F. (2000), "Descriptive Statistics for Symbolic Data," in Analysis of Symbolic Data, pp. 106-124.
[10] Billard, L. (2007), "Dependencies and Variation Components of Symbolic Interval-Valued Data," Selected Contributions in Data Analysis and Classification, pp. 3-12.
[11] De Carvalho, F. and Lechevallier, Y. (2009), "Dynamic Clustering of Interval-Valued Data Based on Adaptive Quadratic Distances," IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, pp. 1295-1306.
[12] Arjovsky, M., Chintala, S., and Bottou, L. (2017), "Wasserstein GAN," arXiv:1701.07875.
[13] https://thingspeak.com/, ThingSpeak cloud service platform.
[14] https://erdb.epa.gov.tw, Environmental Protection Administration environmental resource database.
[15] Li, Y. C. (2018), "Improved AP Clustering and Its Application on Distributional-Valued Variable," National Formosa University.
[16] Jain, A. K. (2010), "Data clustering: 50 years beyond K-means," Pattern Recognit. Lett., vol. 31, no. 8, pp. 651-666.
[17] Xie, B., Wang, M., and Tao, D. (2011), "Toward the optimization of normalized graph Laplacian," IEEE Trans. Neural Netw., vol. 22, no. 4, pp. 660-666.
[18] He, Z., Xie, S., Zdunek, R., Zhou, G., and Cichocki, A. (2011), "Symmetric nonnegative matrix factorization: Algorithms and applications to probabilistic clustering," IEEE Trans. Neural Netw., vol. 22, no. 12, pp. 2117-2131.
[19] Huang, X., Ye, Y., and Zhang, Z. (2014), "Extensions of k-means-type algorithms: A new clustering framework by integrating intracluster compactness and intercluster separation," IEEE Trans. Neural Netw. Learn. Syst., vol. 25, no. 8, pp. 1433-1446.
[20] Jing, L., Ng, M. K., and Zeng, T. (2013), "Dictionary learning-based subspace structure identification in spectral clustering," IEEE Trans. Neural Netw. Learn. Syst., vol. 24, no. 8, pp. 1188-1199.
[21] Hou, C., Nie, F., Yi, D., and Tao, D. (2015), "Discriminative embedded clustering: A framework for grouping high-dimensional data," IEEE Trans. Neural Netw. Learn. Syst., vol. 26, no. 6, pp. 1287-1299.
[22] Taşdemir, K., Milenov, P., and Tapsall, B. (2011), "Topology-based hierarchical clustering of self-organizing maps," IEEE Trans. Neural Netw., vol. 22, no. 3, pp. 474-485.
[23] Kingma, D. P. and Welling, M. (2014), "Auto-Encoding Variational Bayes," in Proceedings of the International Conference on Learning Representations (ICLR).
[24] Radford, A., Metz, L., and Chintala, S. (2016), "Unsupervised Representation Learning with Deep Convolutional Generative Adversarial Networks," arXiv:1511.06434.
[25] Zhu, J. Y., Park, T., Isola, P., and Efros, A. A. (2018), "Unpaired Image-to-Image Translation using Cycle-Consistent Adversarial Networks," arXiv:1703.10593.
[26] Smith, E. and Meger, D. (2017), "Improved Adversarial Systems for 3D Object Generation and Reconstruction," arXiv:1707.09557.
[27] Wang, Z. Q. (2018), "Study on Distributional-Valued Variable Analysis Using Machine Learning Approaches," National Formosa University.
[28] Krizhevsky, A., Sutskever, I., and Hinton, G. E. (2012), "ImageNet Classification with Deep Convolutional Neural Networks," Advances in Neural Information Processing Systems 25 (NIPS).