I recreated the chart for Georgia (since I live in Atlanta); I'm assuming the rules are similar. What interval is used to calculate the marginal rates? The only thing I can figure is that it must be pretty high. Calculating changes at $1,000 intervals creates marginal rates in excess of 900% at two different points.
Maybe I am modeling this incorrectly. I have read that Medicaid and PeachCare phase out at specific income amounts. In my model, gaining $1,000 of income costs you $9,000 of benefits at one point and $12,000 at another (losing $9,000 in benefits on a $1,000 raise works out to a 900% effective marginal rate). For a single mother of two (the assumed household in the model), that is indeed a hell of a marginal tax hike.
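For what it's worth, here is roughly how I computed it: a minimal Python sketch with made-up numbers (not the actual Georgia benefit schedule), showing how a benefit cliff shows up as an effective marginal rate far above 100% when you difference net resources at $1,000 intervals.

```python
def marginal_rates(incomes, net_resources):
    """Effective marginal rate between consecutive income steps:
    1 - (change in net resources) / (change in gross income)."""
    rates = []
    for i in range(1, len(incomes)):
        d_income = incomes[i] - incomes[i - 1]
        d_net = net_resources[i] - net_resources[i - 1]
        rates.append(1 - d_net / d_income)
    return rates

# Stylized example: earning the next $1,000 triggers a $9,000 benefit
# loss, so net resources fall by $8,000 -> a 900% marginal rate.
incomes = [20_000, 21_000, 22_000]
net = [32_000, 24_000, 24_800]  # cliff between the first two steps
for lo, hi, r in zip(incomes, incomes[1:], marginal_rates(incomes, net)):
    print(f"${lo:,} -> ${hi:,}: {r:.0%}")
```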