Scientists now tend to say "global climate change" instead of "global warming," because the effects won't just warm the earth but will change the climate as a whole. Anyway, it's very much evident that the earth's climates are changing quite drastically, but the disputed part is whether humans are the cause. Some people say it's natural and happens to the earth every now and then, while others think it's related to humans putting more carbon dioxide and other greenhouse gases into the air. At least, that's what I know so far. So to answer your first question: yes, the earth's climates are changing. Just look at historical climate records for your area or other areas and you'll see there has been a lot of change. To answer your second question: as far as I know, there hasn't been definitive proof.