August 24th, 2004, 12:21 PM
Lictor

"Paul" wrote in message
...
If you use a scale and don't cheat you can be very accurate in counting
calories for home-prepared foods.


I don't think so. You're assuming the values for the base components are
accurate, when they're not. Does your calorie table distinguish between a
Golden apple and a Granny Smith? Does it make a difference between
spring-summer beef (fed on thick grass, hence fatter) and winter beef
(leaner)? Does it list both American corn and traditional corn (less
sugar)? Does it differentiate between African and Caribbean bananas? If it
doesn't, you can't claim much accuracy; you're just getting a rough
estimate.
Moreover, the calories-per-gram figures are themselves estimates, rounded to
the nearest whole number. A gram of protein is not 4 calories, it's 4.2. A
gram of carbs is 3.74, not 4. Likewise, fat is 9.3 per gram, not 9. A lot of
sources do not take that into account, and that rounding alone adds several
percent of error before you've weighed a single thing.
Obviously, some people do not understand that. I see people posting their
daily diet down to the single calorie. That's a level of precision that is
simply impossible to achieve.
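
Just to put rough numbers on the rounding alone, here is a small
back-of-the-envelope check in Python (the factors are the ones quoted above;
nothing else is assumed):

# Per-gram energy factors: the rounded values most tables use vs. the more
# precise figures quoted above. The loop prints how far off each rounding is.
ROUNDED = {"protein": 4.0, "carbs": 4.0, "fat": 9.0}   # kcal per gram
PRECISE = {"protein": 4.2, "carbs": 3.74, "fat": 9.3}  # kcal per gram

for macro in ROUNDED:
    error = abs(ROUNDED[macro] - PRECISE[macro]) / PRECISE[macro] * 100
    print(f"{macro}: {error:.1f}% off")
# protein: 4.8%, carbs: 7.0%, fat: 3.2% -- and that is before any of the
# food-to-food variation mentioned above.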

Further, if you do the above in conjunction with a nutrition analysis
website or software or even a logbook you can tweak your intake ratios very
easily. I use Fitday and find it extremely helpful in tailoring my diets to
whatever stage of training I'm in, i.e., cutting, bulking, etc.


This means you have to plan your meals in advance. It makes it hard to eat
at restaurants or at friends' places every day. It means weighing your food
precisely. Otherwise, you will only get a very rough estimate of the
calories you ate.
Compared to the brain's natural ability to count calories, it's a very, very
weak substitute. Your taste buds can distinguish sugar content down to the
single calorie (enough to sort several cups of tea by sugar content). Your
brain has a four-way feedback loop to adjust your food intake (real-time
estimation, the 15-minute mark during the meal, from meal to meal, and from
week to week).
I would rather devote energy to restoring the proper function of that
fantastic tool than waste it on trying to make a shaky substitute work.
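
For the record, the bookkeeping a logbook or a site like Fitday does is
nothing more exotic than the Python sketch below (the per-100 g values and
the day's log are invented for illustration, and they carry exactly the
uncertainty discussed above):

# Toy version of what a food log / nutrition tracker does: sum the macros
# over the day's entries, then report total calories and the intake ratios.
# The per-100 g values are hypothetical, not real database figures.
FOOD_DB = {
    # food: (protein, carbs, fat) in grams per 100 g
    "chicken breast": (31.0, 0.0, 3.6),
    "cooked rice":    (2.7, 28.0, 0.3),
    "banana":         (1.1, 23.0, 0.3),
}

day_log = [("chicken breast", 150), ("cooked rice", 200), ("banana", 120)]  # grams eaten

totals = {"protein": 0.0, "carbs": 0.0, "fat": 0.0}
for food, grams in day_log:
    p, c, f = FOOD_DB[food]
    totals["protein"] += p * grams / 100
    totals["carbs"]   += c * grams / 100
    totals["fat"]     += f * grams / 100

calories = totals["protein"] * 4 + totals["carbs"] * 4 + totals["fat"] * 9
ratios = {m: g * (9 if m == "fat" else 4) / calories for m, g in totals.items()}
print(f"{calories:.0f} kcal", {m: f"{r:.0%}" for m, r in ratios.items()})

Every number in there is only as good as the table it came from, which is
the whole point.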

People who say that calorie counting doesn't work aren't doing it right or
are cheating or both. Outside of crap fad diet books no one says that it's
easy. It takes discipline and commitment for the long haul.


Calorie counting works for losing weight, because the deficit is large
enough to absorb the lack of precision. If you burn 2400 calories a day and
are trying to eat 1500, even a +30% error keeps you in deficit. So it does
work during that phase. The problem comes when you're trying to maintain
weight: that 30% margin of error then becomes enormous.
It does take more discipline and commitment to tune your own brain; there is
no software to help you and you can't press F1 for help. But I still feel
the reward is worth the extra trouble compared to calorie counting, in terms
of results, stability and flexibility.
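
The arithmetic behind that, spelled out (the 2400 and 1500 figures are the
ones from the example above, and the 30% error is the same worst case):

# Why a big cutting deficit absorbs the counting error but maintenance does not.
burned = 2400         # kcal actually burned per day (figure from the example)
error = 0.30          # assumed worst-case error on what you think you ate

cutting_target = 1500
print(cutting_target * (1 + error) - burned)      # -450: still a solid deficit

maintenance_target = 2400
print(maintenance_target * (1 + error) - burned)  # +720: a large daily surplus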

And as an aside I'll add that every calorie requirement estimator I've
ever seen has been way off. I'd start by knocking 500 off whatever number
they give you and even at that it's a safe bet that it'll still be too
high.


That's precisely what I have been saying... As you have noticed, the whole
process has a wide margin of error. The "correction" you're making doesn't
correct that error; it just makes sure you always end up in a deficit.
That's fine for losing weight, but you can't count on it for maintaining.
If you've done any chemistry or physics, you know what to think of
experiments where you "just add/subtract a random number, and it works
exactly as the formula says, so it's a success"...
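
To illustrate with one common estimator (the post doesn't say which one Paul
has in mind; Mifflin-St Jeor is just a typical example here, and the
person's numbers are made up):

# A typical calorie-requirement estimator plus the blanket "knock 500 off".
# Mifflin-St Jeor is used only as an example of the genre; the inputs below
# describe a hypothetical person.
def daily_requirement(weight_kg, height_cm, age, male=True, activity=1.55):
    bmr = 10 * weight_kg + 6.25 * height_cm - 5 * age + (5 if male else -161)
    return bmr * activity  # 1.55 = "moderately active" multiplier

estimate = daily_requirement(80, 180, 30)   # roughly 2760 kcal for this made-up person
adjusted = estimate - 500                   # roughly 2260 kcal after the flat correction
print(round(estimate), round(adjusted))
# Subtracting a constant shifts the number down, but it does not shrink the
# individual spread around the formula -- it only biases you toward a
# deficit, which is exactly the objection above.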