What is the meaning of the word Hollywood?
Hollywood is the home of movie studios and stars, the heart of American cinema. It is a district in Los Angeles, California, that attracts millions of tourists a year. Everyone "knows" what Hollywood is, but do you know the true meaning behind the word Hollywood?