I love Japan; I'd always been fascinated with the country before I got here, and I'm more so than ever now.
I've heard from some people that the more you understand what's going on around you, the less you like Japan. Is that true? And if so, why?
Thanks
Shell
Hi, I'm new here.
I always had this image that Japan was really safe, and I was kind of surprised to read this.
I've visited Japan twice in winter to see my friend who lives in Gifu Prefecture, and I was always surprised at how skiers and snowboarders leave all their gear unlocked when they go in for lunch and such. What a luxury.
It would be a pity if Japan is changing for the worse.