There is an article in my magazine debating whether we should worry about the changes Hollywood makes to historical events in its films. I found it quite interesting, and I could see where both sides were coming from. I agree that it is irritating when films are made that are completely historically inaccurate. However, if people believe that the movies are telling them what really happened, surely that isn't Hollywood's fault. Hollywood is just making what it thinks will sell, so of course it will change things. If people believe those versions, isn't that the fault of a poor education system? If people don't know their history, then of course they are going to believe what they see in the movies. I don't think history is well taught in schools, and I think it is easier to change what is being taught there than it is to get Hollywood to portray an accurate account of the past in its movies.