Tuesday, January 18, 2011

The tax-rates-affect-hiring-decisions fallacy

If you listened to the debate about whether to let the temporary tax breaks expire for earners in the top income bracket, you probably heard one absurd assertion over and over: that higher tax rates discourage hiring. Now, there are many, many ways to demonstrate this assertion is absurd--my favorite is simply to look at the empirical record of historic unemployment rates versus top marginal tax rates. (Conclusion: unemployment has actually been lower, historically, during the periods with the highest marginal tax rates.) In this post, however, I want to point out a basic fallacy in the argument commonly presented to "prove" that higher tax rates discourage hiring.

The argument goes something like this: when business owners and managers decide whether to hire a new employee, they compare the extra income they expect that employee to generate against what the employee will cost the company. If the employee is expected to produce net income after costs, the employee will probably be hired. Of course, most companies require some minimum expected profit to justify the risk being taken, and often the numbers are less a precise calculation on paper than a general sense of expected returns and costs. But the basic idea is always there.
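
As a rough sketch, that decision rule might be written like this (the function name and the hurdle parameter are my own illustrations, not anything from the argument itself):

```python
# A minimal sketch of the hiring rule described above.
# "minimum_profit" stands in for whatever hurdle a company requires
# to justify the risk of a hire.
def should_hire(expected_revenue, cost, minimum_profit=0):
    """Hire if the expected net income clears the company's profit hurdle."""
    return expected_revenue - cost >= minimum_profit
```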

So, the argument continues, if we reduce the expected return by taxing some of that additional income, then we reduce the incentive to hire, and fewer people get hired.

To use an example: if hiring an employee is expected to cost $100,000, and that employee is expected to bring in $150,000, the expected net income is $50,000. There is, of course, a risk the employee won't pay off and the hire will actually cost the company money, but assume the expected return of $50,000 is worth that risk. With a 50% tax rate in place, however, the $50,000 in net income yields only $25,000 to the owner. And in this example, the owner might decide that risking $100k on a worthless employee is too great for a return of only $25,000, even though the risk would have been worth taking for an expected return of $50,000. Makes sense up to a point, right?
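
To make the naive arithmetic concrete, here is a minimal sketch using the figures from the example (the variable names are mine):

```python
# The naive argument's arithmetic, using the figures above.
cost = 100_000
expected_revenue = 150_000
tax_rate = 0.50

pretax_net = expected_revenue - cost        # $50,000 expected net income
aftertax_net = pretax_net * (1 - tax_rate)  # only $25,000 after tax

print(pretax_net, aftertax_net)  # 50000 25000.0
```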

But here's the problem: the argument and example above don't consider that taxes also reduce the real cost of hiring an employee. If the employee costs $100k and generates $0 of additional income, the employer simply deducts the $100k from his taxable income and (at a 50% rate) sees a $50k reduction in his taxes. So the analysis should actually look like this:

With no taxes, the employer risks $100k for an expected net return of $50k. That's a return of 50% on the investment ($50k earnings / $100k cost). But with a 50% tax, the employer who incurs $100k of costs to hire an employee has really incurred only a $50k cost, because of the tax deduction available. The expected net return shrinks to $25k as a result of the tax. But the expected real return remains 50% ($25k net earnings / $50k real cost).
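
A short sketch makes the point plain: the (1 - tax_rate) factor multiplies both the after-tax return and the real cost, so it cancels out of the ratio, and the ROI comes out the same at any rate (the function name is my own):

```python
# Corrected analysis: hiring costs are deductible, so the tax factor
# (1 - tax_rate) scales the net return and the real cost alike,
# and it cancels out of the return-on-investment ratio.
def real_roi(cost, expected_revenue, tax_rate):
    aftertax_net = (expected_revenue - cost) * (1 - tax_rate)
    real_cost = cost * (1 - tax_rate)  # the deduction refunds tax_rate * cost
    return aftertax_net / real_cost

for rate in (0.0, 0.35, 0.50, 0.70):
    print(rate, real_roi(100_000, 150_000, rate))  # always 0.5
```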

This analysis holds no matter what tax rate you plug in. Raising or lowering taxes does nothing to change the fundamental return-on-investment ratio that owners, investors, and managers use in making hiring decisions. Anybody who uses the argument outlined above to claim that raising taxes will reduce the incentive to hire is either being very dishonest or very foolish.
