Amorphous silicon (a-Si:H) flat-panel detectors are prevalent in radiotherapy for megavoltage imaging tasks. Any clinical or dosimetric application requires a well-defined dose response of the system to achieve meaningful results. Due to radiation damage, panels deteriorate: both the linearity of the pixel response to dose and the stability under changing operating temperatures worsen over time. Using a single-level gain correction can lead to an error of about 23% when acquiring a flood-field image at 100 MU min(-1) on an aged detector. A multi-level gain (MLG) correction is introduced that corrects the nonlinearities and subpanel-related artifacts caused by the insufficient radiation hardness of the amplifiers in the read-out electronics. With rising temperature, offset values typically increase (by up to 300 gray values), while the response at higher dose values per frame remains constant for the majority of pixels. To account for temperature-related image artifacts, two additional temperature correction methods have been developed. MLG in combination with the temperature corrections can restore the a-Si:H image quality to the performance required of reliable clinical verification tools. Furthermore, the life span and recalibration intervals of these costly devices can be prolonged decisively.
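The core idea of a multi-level gain correction can be sketched as follows: instead of dividing each pixel by a single gain factor measured at one dose level, the detector is calibrated at several known dose levels, and each raw reading is mapped back to dose by per-pixel piecewise-linear interpolation between those calibration points. The snippet below is a minimal illustration of this scheme, not the authors' actual implementation; the array shapes, function name, and calibration format are assumptions for the example.

```python
import numpy as np

def multi_level_gain_correction(raw, offset, cal_raw, cal_dose):
    """Map a raw frame to dose via per-pixel piecewise-linear
    interpolation between multiple gain-calibration levels.

    raw      : (H, W) raw pixel values of the acquired frame
    offset   : (H, W) dark/offset frame to subtract
    cal_raw  : (L, H, W) offset-corrected pixel readings at the
               L calibration dose levels (monotonically increasing)
    cal_dose : (L,) known dose per frame at each calibration level

    Returns a (H, W) array of dose values. A single-level gain
    correction would correspond to L = 2 with cal_dose = [0, D].
    """
    net = (raw - offset).astype(float)
    n_levels = cal_raw.shape[0]
    flat_net = net.ravel()
    flat_cal = cal_raw.reshape(n_levels, -1)
    out = np.empty_like(flat_net)
    for i in range(flat_net.size):
        # np.interp clamps values outside the calibrated range
        out[i] = np.interp(flat_net[i], flat_cal[:, i], cal_dose)
    return out.reshape(net.shape)
```

For a pixel whose response has become sublinear (e.g. readings 0, 600, 1000 at doses 0, 50, 100), a net reading of 800 maps to a dose of 75 rather than the 40 a single linear gain factor fitted through the top calibration point would yield, which is the kind of nonlinearity error the MLG correction removes.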