The mass-loss rates of red supergiants (RSGs) govern their evolution towards supernova and dictate the appearance of the resulting explosion. To study how mass-loss rates change with evolutionary state, we have measured the mass-loss rates and extinctions of RSGs in a sample of clusters in the MW and LMC. By targeting stars in coeval clusters we are able to study the evolution of the mass-loss rate whilst keeping mass and metallicity fixed. Our results indicate that there is little justification for substantially increasing the mass-loss rates during the RSG phase, as has recently been suggested in order to explain the absence of high-mass Type IIP supernova progenitors. We next study the most evolved star in each cluster and ask: if this star were to explode tomorrow, what would we infer about its initial mass from the available photometry? By comparing initial-mass estimates from photometric data with those from isochrone fitting, we find that relying on pre-explosion photometry may lead to significantly underestimated masses. We suggest that this is due to increased circumstellar extinction at the end of the RSGs' lives, which causes the star to appear less luminous and hence leads to a lower inferred mass.
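To make the final step explicit, the bias can be sketched with a simple, illustrative scaling (the quantities $A_{\rm bol}$ and $\alpha$ below are generic assumptions for illustration, not values measured in this work). Unaccounted bolometric circumstellar extinction $A_{\rm bol}$ dims the star, so the luminosity and hence the mass recovered from photometry are biased low,
$$
\frac{L_{\rm inferred}}{L_{\rm true}} = 10^{-0.4\,A_{\rm bol}}, \qquad
\frac{M_{\rm inferred}}{M_{\rm true}} \simeq \left(\frac{L_{\rm inferred}}{L_{\rm true}}\right)^{1/\alpha} = 10^{-0.4\,A_{\rm bol}/\alpha},
$$
assuming a luminosity--initial-mass relation of the form $L \propto M^{\alpha}$ for RSGs near the end of their lives. For example, an unaccounted $A_{\rm bol} \approx 1$ mag with an illustrative $\alpha \approx 3$ would bias the inferred initial mass low by roughly 25 per cent.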