I had some time to kill this morning (actually I didn't have the time, but I managed to kill it anyway), so I generated a set of neural nets, using an old neural net tool that I have, to estimate population growth. All the nets had sufficient predictive power for the short term (a decade or two out) based on historical population data alone. However, that data only supports regression analysis (fitting a curve to the data) and is only reliable for short-term population projection. The next step is to factor in the fertility rates and birth rates from the UN data. Hopefully this will show the population curve levelling off, as the UN claims in its middle and best-case scenarios.
So there were sixteen data elements. (I replicated this data three times to provide a training set, a test set and a production set; with this few data points you have to use the data you have on hand for both training and testing.)
1804 1 B
1924 2 B
2012.1945 (around March 15, 2012) 7 B
+ the UN data
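The replication step above can be sketched as follows. This is a minimal illustration of the trick described, not the tool's actual data handling; only the three milestone rows are shown, and the remaining UN data points are omitted.

```python
# The three milestone points listed above (year, population in billions).
milestones = [
    (1804.0, 1.0),
    (1924.0, 2.0),
    (2012.1945, 7.0),   # around March 15, 2012
]

# With so few observations, the same points must serve as the
# training, test, and production sets.
train_set = list(milestones)
test_set = list(milestones)
production_set = list(milestones)
```

This obviously means the "test" error is really a training-set fit error, which is why the projections should only be trusted a decade or two out.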
I ended up generating a specialized (sort of proprietary) regression net that the tool suggests as a predictive net. This net has four slabs:
input (1 neuron, linear) -> slab 2 (2 neurons, Gaussian)            -> output (1 neuron, logistic)
                         -> slab 3 (2 neurons, tanh)                -> output
                         -> slab 4 (2 neurons, Gaussian complement) -> output
This is a backpropagation network with multiple strategies (those functions: linear, Gaussian, tanh, etc.) for "squashing" the data and reducing the error to fit the output to the observed data.
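A minimal sketch of that architecture follows. This is my assumption about how the tool wires it, not its actual internals: one linear input feeds three parallel hidden slabs with Gaussian, tanh, and Gaussian-complement activations, and all six hidden outputs feed one logistic output neuron, trained by plain backpropagation on the scaled milestones.

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss(z):   return np.exp(-z ** 2)
def gauss_c(z): return 1.0 - np.exp(-z ** 2)   # "Gaussian complement"
def sigmoid(z): return 1.0 / (1.0 + np.exp(-z))

years = np.array([1804.0, 1924.0, 2012.2])        # the three milestones
pops  = np.array([1.0, 2.0, 7.0])                 # billions
x = (years - years[0]) / (years[-1] - years[0])   # scale input into [0, 1]
t = pops / 8.0                                    # scale target into (0, 1)

W  = [rng.normal(0.0, 1.0, 2) for _ in range(3)]  # weights, 2 neurons per slab
b  = [np.zeros(2) for _ in range(3)]              # biases per slab
wo = rng.normal(0.0, 1.0, 6)                      # output weights (6 hidden)
bo = 0.0

acts  = [gauss, np.tanh, gauss_c]
dacts = [lambda z: -2 * z * np.exp(-z ** 2),      # d/dz Gaussian
         lambda z: 1 - np.tanh(z) ** 2,           # d/dz tanh
         lambda z: 2 * z * np.exp(-z ** 2)]       # d/dz Gaussian complement

def forward(xi):
    zs = [W[k] * xi + b[k] for k in range(3)]     # slab pre-activations
    h = np.concatenate([acts[k](zs[k]) for k in range(3)])
    return zs, h, sigmoid(wo @ h + bo)

def total_loss():
    return sum((forward(xi)[2] - ti) ** 2 for xi, ti in zip(x, t))

loss_before = total_loss()
lr = 0.3
for _ in range(3000):                             # plain stochastic backprop
    for xi, ti in zip(x, t):
        zs, h, y = forward(xi)
        dy = (y - ti) * y * (1 - y)               # error at the output pre-activation
        dh = dy * wo                              # error pushed back to hidden layer
        for k in range(3):
            dz = dh[2 * k:2 * k + 2] * dacts[k](zs[k])
            W[k] -= lr * dz * xi
            b[k] -= lr * dz
        wo -= lr * dy * h
        bo -= lr * dy
loss_after = total_loss()
```

The point is just that the different activation functions carve the curve in different ways before the logistic output recombines them; the real tool no doubt uses a more sophisticated training schedule.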
The second net is a polynomial net that was independently invented by Ivakhnenko and Barron. This is also called a GMDH (Group Method of Data Handling) network.
Interestingly, although it overshoots the data slightly (but pretty consistently), it looks like a better predictor overall. However, the data below is from the somewhat exotic first net I presented, because it seems to have less overall error.
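For flavor, the building block of an Ivakhnenko/GMDH polynomial network is a low-order polynomial fitted by least squares; a full GMDH net stacks and prunes many such units by held-out error. The sketch below fits just one quadratic unit to the three milestones as an illustration of that core idea, not a reconstruction of the actual GMDH net.

```python
import numpy as np

years = np.array([1804.0, 1924.0, 2012.2])
pops  = np.array([1.0, 2.0, 7.0])               # billions

xs = (years - years.mean()) / years.std()       # normalize for conditioning
A = np.column_stack([np.ones_like(xs), xs, xs ** 2])   # 1, x, x^2 terms
coef, *_ = np.linalg.lstsq(A, pops, rcond=None)        # least-squares fit

def predict(year):
    z = (year - years.mean()) / years.std()
    return coef @ np.array([1.0, z, z ** 2])
```

With three points and three coefficients the fit is exact, which is exactly why a real GMDH run needs a separate selection set to keep the polynomials from overfitting.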
Now these are basically back-of-the-envelope calculations, just with primitive neural networks:
Year and Population in thousands:
2001 6156566.283541
2002 6238136.027771
2003 6319963.535995
2004 6402046.828999
2005 6484383.894182
2006 6566972.685628
2007 6649811.124191
2008 6732897.097584
2009 6816228.460480
2010 6899803.034618
2011 6983618.608920
2012 7067672.939613
2013 7151963.750365
2014 7236488.732424
2015 7321245.544768
2016 7406231.814260
2017 7491445.135821
2018 7576883.072598
2019 7662543.156152
2020 7748422.886648
I won't go through and round off the estimates because it would take time. This looks fairly accurate out to 2020.
Kirt
Kirt's Tibetan Translation Notes
"Even if you practice only for an hour a day with faith and inspiration, good qualities will steadily increase. Regular practice makes it easy to transform your mind. From seeing only relative truth, you will eventually reach a profound certainty in the meaning of absolute truth."
Kyabje Dilgo Khyentse Rinpoche.
"Only you can make your mind beautiful."
HH Chetsang Rinpoche
"Most all-knowing Mañjuśrī, ...
Please illuminate the radiant wisdom spirit
Of my precious Buddha nature."
HH Thinley Norbu Rinpoche