君がいるだけで by 米米クラブ

8 12 2009

君がいるだけで by 米米クラブ (Kimi ga iru dake de / Just by Having You Here, by Kome Kome Club). A popular pop song. I like it a lot.

Lyrics:

Tatoeba kimi ga iru dake de kokoro ga tsuyoku nareru koto
Nani yori taisetsu na mono wo kizukasete kureta ne

Arigachi na wana ni tsui hikikomare
Omoi mo yoranai kuyashii namida yo
Jibun no yowasa mo shiranai kuse ni
Tsuyogari no kisha wo hashirasete ita

Meguri atta toki no you ni
Itsu made mo kawarazu iraretara
Wow wow True heart

Tatoeba kimi ga iru dake de kokoro ga tsuyoku nareru koto
Nani yori taisetsu na mono wo kizukasete kureta ne

Uragiri no kagami ni utsushi dasareta
Egao ni tsurarete nagasareta hibi
Hakanai mono e no akogare dake de
Sugu me no mae ni aru koto wo wasureteta

Naze ni motto
Sunao ni narenakatta no darou
Kimi ni made wow wow True heart

Tatoeba kimi ga iru dake de kokoro ga tsuyoku nareru koto
Nani yori taisetsu na mono wo kizukasete kureta ne

True heart tsutaerarenai True heart wakatte
True heart mienai mono wo True heart mitsumete

Tatoeba kimi ga iru dake de kokoro ga tsuyoku nareru koto
Itsu demo itsu no toki mo futari wa otagai wo mitsumeteru

Tatoeba kimi ga iru dake de kokoro ga tsuyoku nareru koto
Itsu demo itsu no toki mo futari wa otagai wo mitsumeteru

Ra ra ra ra…

Japanese Lyrics:

たとえば 君がいるだけで 心が強くなれること
何より大切なものを 気付かせてくれたね

ありがちな罠に つい引き込まれ 思いもよらない くやしい涙よ
自分の弱さも 知らないくせに 強がりの汽車を 走らせていた

めぐり逢った時のように いつまでも変わらず いられたら
wow wow True Heart

たとえば 君がいるだけで 心が強くなれること
何より大切なものを 気付かせてくれたね

裏切りの鏡に 映しだされた 笑顔につられて 流された日々
はかないものへの 憧れだけで すぐ目の前にあることを 忘れてた

なぜにもっと 素直になれなかったのだろう 君にまで
wow wow True Heart

たとえば 君がいるだけで 心が強くなれること
何より大切なものを 気付かせてくれたね

True Heart 伝えられない True Heart わかって
True Heart 見えないものを True Heart 見つめて

たとえば 君がいるだけで 心が強くなれること
いつでも いつの時も 二人は お互いを見つめてる

たとえば 君がいるだけで 心が強くなれること
いつでも いつの時も 二人は お互いを見つめてる




My Clustering Bibliography

4 12 2009

Agusta, Y. (2004). Minimum Message Length Mixture Modelling for Uncorrelated and Correlated Continuous Data Applied to Mutual Funds Classification, Ph.D. Thesis, School of Computer Science and Software Engineering, Monash University, Clayton, Victoria 3800, Australia.

Agusta, Y. and Dowe, D.L. (2002a). MML Clustering of Continuous-Valued Data using Gaussian and t Distributions, in B. McKay and J. Slaney (eds), Lecture Notes in Artificial Intelligence 2557, Proceedings of the 15th Australian Joint Conference on Artificial Intelligence (AI02), Springer-Verlag, Berlin, Germany, pp. 143-154

Agusta, Y. and Dowe, D.L. (2002b). Clustering of Gaussian and t Distributions using Minimum Message Length, in M. Sasikumar, H. J. Jayprasad and M. Kavitha (eds), Artificial Intelligence: Theory and Practice, Proceedings of the International Conference Knowledge-Based Computer Systems (KBCS-2002), Vikas Publishing House Pvt. Ltd., New Delhi, India, pp. 289-299.

Agusta, Y. and Dowe, D.L. (2003). Unsupervised Learning of Correlated Multivariate Gaussian Mixture Models using MML, in T.D. Gideon and L.C. Fung (eds), Lecture Notes in Artificial Intelligence 2903, Proceedings of the 16th Australian Joint Conference on Artificial Intelligence (AI03), Springer-Verlag, Berlin, Germany, pp. 477-489.

Akaike, H. (1974). A New Look at the Statistical Model Identification, IEEE Transactions on Automatic Control AC-19(6): 716-723.

Bezdek, J. C. (1981). Pattern Recognition with Fuzzy Objective Function Algorithms, Plenum Press, New York.

Cheeseman, P. and Stutz, J. (1996). Bayesian Classification (AutoClass): Theory and Results, in U. M. Fayyad, G. Piatetsky-Shapiro, P. Smyth and R. Uthurusamy (eds), Advances in Knowledge Discovery and Data Mining, AAAI Press/MIT Press, Cambridge, MA, pp. 153-180.

Figueiredo, M. A. and Jain, A. K. (2002). Unsupervised Learning of Finite Mixture Models, IEEE Transactions on Pattern Analysis and Machine Intelligence 24(3): 381-396.

Fraley, C. and Raftery, A. E. (1998). MCLUST: Software for Model-Based Cluster and Discriminant Analysis, Technical Report 342, Department of Statistics, University of Washington, Box 354322, Seattle, WA, USA.

MacQueen, J. B. (1967). Some Methods for Classification and Analysis of Multivariate Observations, Proceedings of the 5th Berkeley Symposium on Mathematical Statistics and Probability, University of California Press, Berkeley, 1: 281-297.

McLachlan, G.J. and Peel, D. (2002a). On Computational Aspects of Clustering via Mixtures of Normal and t-Components, Proceedings of the American Statistical Association (Bayesian Statistical Science Section), Indianapolis, Alexandria, Virginia.

McLachlan, G.J. and Peel, D. (2002b). Finite Mixture Models, John Wiley and Sons, New York.

McLachlan, G. J., Peel, D., Basford, K. E. and Adams, P. (1999). The EMMIX Software for the Fitting of Mixtures of Normal and t-Components, Journal of Statistical Software 4(2): 1087-1092.

Miyamoto, S. and Agusta, Y. (1995). An Efficient Algorithm for L1 Fuzzy c-Means and its Termination, Control and Cybernetics 24(4): 422-436.

Miyamoto, S. and Agusta, Y. (1995). Algorithms for L1 and Lp Fuzzy C-Means and Their Convergence, in C. Hayashi, N. Oshumi, K. Yajima, Y. Tanaka, H. H. Bock and Y. Baba (eds), Data Science, Classification, and Related Methods, Springer-Verlag, Tokyo, Japan, pp. 295-302.

Miyamoto S. and Nakayama, Y. (2003). Algorithms of Hard C-Means Clustering Using Kernel Functions in Support Vector Machines, Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 7, No. 1, pp. 19–24.

Miyamoto, S. and Suizu, D. (2003). Fuzzy C-Means Clustering Using Kernel Functions in Support Vector Machines, Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 7, No. 1, pp. 25–30.

Neal, R. M. (1991). Bayesian Mixture Modeling by Monte Carlo Simulation, Technical Report CRG-TR-91-2, Department of Computer Science, University of Toronto, Toronto, Canada.

Pena, J. M., Lozano, J. A. and Larranaga, P. (1999). An Empirical Comparison of Four Initialization Methods for the k-Means Algorithm, Pattern Recognition Letters 20: 1027-1040.

Schwarz, G. (1978). Estimating the Dimension of a Model, The Annals of Statistics 6: 461-464.

Shannon, C. E. (1948). A Mathematical Theory of Communication, Bell System Tech. Journal 27: 379-423.

Tibshirani, R., Walther, G. and Hastie, T. (2000). Estimating the Number of Clusters in a Dataset using the Gap Statistic, Technical Report 208, Department of Statistics, Stanford University, Stanford, CA 94305, USA.

Wallace, C. S. (1986). An Improved Program for Classification, Proceedings of the 9th Australian Computer Science Conference (ACSC-9), Vol. 8, Monash University, Australia, pp. 357-366.

Wallace, C. S. and Boulton, D. M. (1968). An Information Measure for Classification, Computer Journal 11(2): 185-194.

Wallace, C. S. and Dowe, D. L. (1994). Intrinsic Classification by MML – the Snob Program, Proceedings of the 7th Australian Joint Conference on Artificial Intelligence (AI94), World Scientific, Singapore, pp. 37-44.

Wallace, C. S. and Dowe, D. L. (1997). MML Mixture Modelling of Multi-State, Poisson, von Mises Circular and Gaussian Distributions, Proceedings of the 6th International Workshop on Artificial Intelligence and Statistics, Fort Lauderdale, Florida, pp. 529-536.

Wallace, C. S. and Dowe, D. L. (2000). MML Clustering of Multi-state, Poisson, von Mises Circular and Gaussian Distributions, Statistics and Computing 10: 73-83.

Wallace, C. S. and Freeman, P. R. (1987). Estimation and Inference by Compact Coding, Journal of the Royal Statistical Society B 49(3): 240-265.

Related Theory

Bernardo, J.M. and Smith, A.F.M. (1994). Bayesian Theory, Wiley, Chichester, UK.

Dowe D.L., Oliver J.J. and Wallace C.S. (1996). MML Estimation of the Parameters of the Spherical Fisher Distribution, in S. Arikawa and A.K. Sharma (eds), Lecture Notes in Artificial Intelligence 1160, Proceedings of the 7th International Workshop on Algorithmic Learning Theory, (ALT’96), Springer-Verlag, Heidelberg, Germany, pp. 213-227.

Fitzgibbon, L.J., Dowe, D.L. and Allison, L. (2002a). Univariate Polynomial Inference by Monte Carlo Message Length Approximation, in C. Sammut and A. Hoffmann (eds), Proceedings of the 19th International Conference on Machine Learning (ICML-2002), Morgan Kaufmann, Sydney, pp. 147-154.

Fitzgibbon, L.J., Dowe, D.L. and Allison, L. (2002b). Change-Point Estimation Using New Minimum Message Length Approximations, in M. Ishizuka and A. Sattar (eds), Lecture Notes in Artificial Intelligence 2417, Seventh Pacific Rim International Conference on Artificial Intelligence (PRICAI-2002), Springer-Verlag, Berlin, Germany, pp. 244-254.

Girolami, M. (2002). Mercer Kernel-Based Clustering in Feature Space, IEEE Transactions on Neural Networks, Vol. 13, No. 3, pp. 761-766.

Lam, E. (2000). Improved Approximation in MML, Honours Thesis, School of Computer Science and Software Engineering, Monash University, Clayton, Victoria 3800, Australia.

Lindley, D.V. (1972). Bayesian Statistics, A Review, SIAM, Philadelphia, PA, USA.

Wallace, C. S. and Dowe, D. L. (1999a). Minimum Message Length and Kolmogorov Complexity, Computer Journal 42(4): 270-283. Special issue on Kolmogorov Complexity.

Wallace, C. S. and Dowe, D. L. (1999b). Refinements of MDL and MML Coding, Computer Journal 42(4): 345-347. Special issue on Kolmogorov Complexity.