Kimoto's Gravity Well
-
[quote name=“Explicit” post=“56005” timestamp=“1391078378”]
I am unable to find anything specific on “Kimoto’s Gravity Well”; the logic and/or math behind it. Every single article/post I’ve found just says it is OK, it is a “problem solver”, etc. I’d love to see how it approaches the problem. Has anyone found a more specific description of this Gravity Well, other than that it re-adjusts diff on a single-block basis?
In my opinion the difficulty adjustment as it is implemented now works reasonably well.
Regarding the incoming ASICs: perhaps someone should organize a group buy of ASIC boards so that we can counterbalance the increased hash power of the multipools.
[/quote]+1 on this one
Btw, a colleague mentioned another solution. Wouldn’t a faster retarget time solve the problem with multipoolers that ruin the network? Sure you get the burst, but when they leave, things go back to normal pretty quickly.
Not sure if this would have any other downsides, though…
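As far as I understand it, in Bitcoin-derived code that is mostly a constants change. Here is a rough sketch of what I mean, using placeholder values in the Litecoin style rather than Feathercoin’s actual parameters:
[code]
#include <cstdint>
#include <cstdio>

int main() {
    // Placeholder values, not necessarily Feathercoin's actual parameters.
    const int64_t nTargetSpacing  = 150;    // desired seconds per block (2.5 minutes)
    const int64_t nTargetTimespan = 302400; // seconds covered by one retarget window (3.5 days)

    // Difficulty is recalculated once every nInterval blocks.
    const int64_t nInterval = nTargetTimespan / nTargetSpacing;
    printf("retarget every %lld blocks\n", (long long)nInterval); // 2016 with these numbers

    // A "faster retarget" simply means shrinking nTargetTimespan
    // (e.g. down to a few hours) so that nInterval drops accordingly.
    return 0;
}
[/code]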
-
Wish I could show the links with good info on the Megacoin forum, but the forum and domains are being moved around and I can’t reach the site anymore.
Shortest link to where and how to find more info is via this link: https://twitter.com/aantonop/status/428269040885190657
Check Alienwalkerx’s reply for links. If they lead anywhere… Also, this is what Kimoto posted on Bitcointalk.org:
[quote=Kimoto]
Kimoto Gravity Well difficulty algorithm this month grew from 2 to 12 coins.
Releasing a technical gravity guide shortly so all coins receive a fair chance at life
[/quote] -
Perception of value is only the Danger it is pegged to…
-
Thanks MegaMike. I checked the links on Twitter. The most informative, though not really insightful, page I could find was http://forum.megacoin.in/index.php?topic=2791.msg9395#msg9395, which lists various Kimoto Gravity Well configurations. In the case of MegaCoin it is:
[quote]
Megacoin
Date Added: September 13, 2013 (~3.5 months in)
http://megacoin.co.nz/
EventHorizonDeviation = 1 + (0.7084 * pow((double(PastBlocksMass)/double(144)), -1.228));
BlocksTargetSpacing = 2.5 * 60; // 2.5 minutes
PastSecondsMin = TimeDaySeconds * 0.25;
PastSecondsMax = TimeDaySeconds * 7;
[/quote]This leaves me clueless, since there are no explanations of what PastBlocksMass is, how it is calculated, etc., or what the numbers in EventHorizonDeviation actually mean… I am not even convinced that I am looking at the real thing, as it looks too simple to work as intended.
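Still, for what it’s worth, here is my best guess at how those pieces might fit together, purely from the names, so treat the interpretation (and everything in the snippet below) as my assumption rather than documentation: PastBlocksMass would simply be the number of past blocks inspected so far, and EventHorizonDeviation the tolerated ratio between target and actual solve time for a window of that size, with 1/EventHorizonDeviation as the lower bound.
[code]
#include <cmath>
#include <cstdio>

// Assumption (mine): PastBlocksMass = number of past blocks examined so far;
// EventHorizonDeviation = tolerated target/actual pace ratio for that window,
// with 1/EventHorizonDeviation as the lower bound.
int main() {
    const int windows[] = {12, 36, 144, 576, 4032};
    for (unsigned int i = 0; i < sizeof(windows) / sizeof(windows[0]); ++i) {
        const double pastBlocksMass = windows[i];
        const double eventHorizonDeviation =
            1 + 0.7084 * std::pow(pastBlocksMass / 144.0, -1.228); // constants from the Megacoin config
        std::printf("window %4.0f blocks -> allowed pace ratio %.3f .. %.3f\n",
                    pastBlocksMass, 1.0 / eventHorizonDeviation, eventHorizonDeviation);
    }
    return 0;
}
[/code]If that reading is right, the band is extremely wide for small windows and tightens towards 1 as more blocks are averaged in, so only a large deviation from the target pace lets the algorithm react off just a handful of recent blocks; otherwise it keeps averaging over a longer window.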
Briefly inspecting MegaCoin’s difficulty adjustments, it can be concluded that they work OK. So does the diff adjustment in the case of Feathercoin, just in a bit less granular way.
http://www.coinwarz.com/difficulty-charts/megacoin-difficulty-chart
http://www.coinwarz.com/difficulty-charts/feathercoin-difficulty-chart
I don’t think that at this point “a bit less granular way” of diff adjustment is a problem. On the other hand it may become a problem, even a major one, if multipools become very strong and hit us with terahashes at one moment, and leave after the difficulty rises above a certain level. With a substantial drop in hash rate, even solving a single block could be a major headache.
Would I vote for implementing Kimoto’s Gravity Well? I’d like to understand how it works first, and see how it handles the pessimistic scenario of rapid, high-magnitude hash-rate changes.
Maybe we should be worried about the scrypt-ASIC-dominated future, and see if we can come up with a suitable algorithm on our own (not simply implementing what others have already done), perhaps designing an algorithm that is very sensitive to increased network hash rate (so that we only waste a couple of blocks when a multipool hits us with terahashes), and on the other hand reacts super quickly when the multipool terahashes are gone (perhaps using a failsafe if a block is not solved within a certain time frame*).
* This could provide an incentive for a powerful miner to try to solve a couple of blocks (or even a single block) after the failsafe kicks in. -
[quote name=“vizay” post=“56033” timestamp=“1391087617”]
Btw, a colleague mentioned another solution. Wouldn’t a faster retarget time solve the problem with multipoolers that ruin the network? Sure you get the burst, but when they leave, things go back to normal pretty quickly. Not sure if this would have any other downsides, though…
[/quote]Yes and then you’ll have to do it again.
It never ends. You have to spank em.
-
[quote name=“zerodrama” post=“56284” timestamp=“1391180022”]
[quote author=vizay link=topic=7305.msg56033#msg56033 date=1391087617]
Btw, a colleague mentioned another solution. Wouldn’t a faster retarget time solve the problem with multipoolers that ruin the network? Sure you get the burst, but when they leave, things go back to normal pretty quickly. Not sure if this would have any other downsides, though…
[/quote]Yes and then you’ll have to do it again.
It never ends. You have to spank em.
[/quote]You Spanker you. ;)
I bet you have a paddle under the bed. -
[quote name=“Explicit” post=“56268” timestamp=“1391174697”]
Thanks MegaMike. I checked the links on Twitter. The most informative, though not really insightful, page I could find was http://forum.megacoin.in/index.php?topic=2791.msg9395#msg9395, which lists various Kimoto Gravity Well configurations. In the case of MegaCoin it is:
[quote]
Megacoin
Date Added: September 13, 2013 (~3.5 months in)
http://megacoin.co.nz/
EventHorizonDeviation = 1 + (0.7084 * pow((double(PastBlocksMass)/double(144)), -1.228));
BlocksTargetSpacing = 2.5 * 60; // 2.5 minutes
PastSecondsMin = TimeDaySeconds * 0.25;
PastSecondsMax = TimeDaySeconds * 7;
[/quote]This leaves me clueless, since there are no explanations of what PastBlocksMass is, how it is calculated, etc., or what the numbers in EventHorizonDeviation actually mean… I am not even convinced that I am looking at the real thing, as it looks too simple to work as intended.
Briefly inspecting MegaCoin’s difficulty adjustments, it can be concluded that they work OK. So does the diff adjustment in the case of Feathercoin, just in a bit less granular way.
http://www.coinwarz.com/difficulty-charts/megacoin-difficulty-chart
http://www.coinwarz.com/difficulty-charts/feathercoin-difficulty-chart
I don’t think that at this point “a bit less granular way” of diff adjustment is a problem. On the other hand it may become a problem, even a major one, if multipools become very strong and hit us with terahashes at one moment, and leave after the difficulty rises above a certain level. With a substantial drop in hash rate, even solving a single block could be a major headache.
Would I vote for implementing Kimoto’s Gravity Well? I’d like to understand how it works first, and see how it handles the pessimistic scenario of rapid, high-magnitude hash-rate changes.
Maybe we should be worried about the scrypt-ASIC-dominated future, and see if we can come up with a suitable algorithm on our own (not simply implementing what others have already done), perhaps designing an algorithm that is very sensitive to increased network hash rate (so that we only waste a couple of blocks when a multipool hits us with terahashes), and on the other hand reacts super quickly when the multipool terahashes are gone (perhaps using a failsafe if a block is not solved within a certain time frame*).
* This could provide an incentive for a powerful miner to try to solve a couple of blocks (or even a single block) after the failsafe kicks in.
[/quote]Sorry I can’t give you more information, especially the technical details, but hopefully Kimoto will release those soon. The Megacoin website is still a mess, though. I’ll keep an eye on it and let you know when and how the tech details are released.
The graphs are cool, by the way! Thanks for those.
I think if you look at one of the coins that recently added the Gravity Well, Franko, you can fairly easily see how it benefits them:
http://www.coinwarz.com/difficulty-charts/franko-difficulty-chart
The long periods of high difficulty are gone, and when a spike appears it’s more vicious, basically meaning the time any multipool hits will be shortened due to the fast and furious growth of the difficulty, effectively using their own massive power against them. There are also more spikes, so yeah, there is a downside, but overall it certainly gives a real benefit, and that’s the way I’ve experienced it when it was implemented in Megacoin: no more mining for hours to work away somebody else’s difficulty :)
Of course you need to think for yourself what would be best suited for Feathercoin.
[edit] The forum just came back online, so here are the direct links (minus the tech doc):
The Newbie’s Guide to Kimoto’s Gravity Well : https://forum.megacoin.in/index.php?topic=893.0
Kimoto Gravity Well Configurations : https://forum.megacoin.in/index.php?topic=2791.0 -
It looks like a good idea. It is similar to some things we are considering anyway.
-
[quote name=“wrapper” post=“56437” timestamp=“1391263527”]
It looks like a good idea. It is similar to some things we are considering anyway.
[/quote]I think it tries to address the wrong problem.
-
[quote name=“zerodrama” post=“56504” timestamp=“1391289676”]
[quote author=wrapper link=topic=7305.msg56437#msg56437 date=1391263527]
It looks like a good idea. It is similar to some things we are considering anyway.
[/quote]I think it tries to address the wrong problem.
[/quote]People are not going to spend these coins - any of them. Like an implanted memory of their own, to borrow from Blade Runner, the success of BTC is completely tied to a memory of its success at this point, and we need to act accordingly. The Well addresses one of the issues, but perception is good old marketing. If FTC had a Super Bowl commercial, would our value shoot up, if only for the reason of perception? The original protocol for currency was a metal and a piece of paper, and people built off that and got creative…
-
[quote name=“Horizon” post=“56547” timestamp=“1391325396”]
[quote author=zerodrama link=topic=7305.msg56504#msg56504 date=1391289676]
[quote author=wrapper link=topic=7305.msg56437#msg56437 date=1391263527]
It looks like a good idea. It is similar to some things we are considering anyway.
[/quote]I think it tries to address the wrong problem.
[/quote]People are not going to spend these coins - any of them. Like an implanted memory of their own, to borrow from Blade Runner, the success of BTC is completely tied to a memory of its success at this point, and we need to act accordingly. The Well addresses one of the issues, but perception is good old marketing. If FTC had a Super Bowl commercial, would our value shoot up, if only for the reason of perception? The original protocol for currency was a metal and a piece of paper, and people built off that and got creative…
[/quote]The community in the long run is the life of the coin. Every coin has proved this principle. The more black magic there is in the code, the less the community is able to adapt and participate. This effectively kills the coin or makes it the province of established owners of the old system. SEE New York City HEARINGS. They don’t understand it. They are afraid of it. They can’t even compromise because it is all technowizardry to them. And yeah they’re also trying to own it.
-
[quote name=“Horizon” post=“56547” timestamp=“1391325396”]
[quote author=zerodrama link=topic=7305.msg56504#msg56504 date=1391289676]
[quote author=wrapper link=topic=7305.msg56437#msg56437 date=1391263527]
It looks like a good idea. It is similar to some things we are considering anyway.
[/quote]I think it tries to address the wrong problem.
[/quote]People are not going to spend these coins - any of them. Like an implanted memory of their own, to borrow from Blade Runner, the success of BTC is completely tied to a memory of its success at this point, and we need to act accordingly. The Well addresses one of the issues, but perception is good old marketing. If FTC had a Super Bowl commercial, would our value shoot up, if only for the reason of perception? The original protocol for currency was a metal and a piece of paper, and people built off that and got creative…
[/quote]This is a good discussion thread on Feathercoin network protection techniques, i.e. the Kimoto Gravity Well. The perception talk is a diversion.
What is a “Super Bowl”, some American - Giant Kornflakes container? -
[quote name=“zerodrama” post=“56596” timestamp=“1391344444”]
Kimoto’s Gravity Well
I think it tries to address the wrong problem.
[/quote]I agree, community monitoring (condition-based maintenance) will be an important factor in preventing/detecting attacks.
Particularly in the future, when a number of coins are established world currencies. Until then, it is obvious to me, if not to the Bitcoin “heads in the sand”, that attacks on the altcoins will evolve into, and have already led to, successful attacks on Bitcoin and Litecoin.
We have a defence against many attacks that they don’t, and they are relying on network dominance, which is a mistake.
However, ACP has its own drawbacks and we would like to continue trying ideas to replace it, particularly as it will be less necessary if we take alternative measures, e.g. Kickstarting cheap ASICs for merchants. We need to concentrate on the attacks that are actually happening, e.g. evil multipools.
-
I think it would be a great IDEA to implement Kimoto’s Gravity Well.
-
If someone supplies a patch for Kimoto’s Gravity Well, we can set up a test.
-
[quote name=“wrapper” post=“56620” timestamp=“1391349307”]
If someone supplies a patch for Kimoto’s Gravity Well, we can set up a test.
[/quote]The problem is that you can’t explain Kimoto’s algorithm to the average person. How are you going to explain it to regulators? It’s black magic for a problem caused by pools. Kimoto won’t kill an attack by zealots.
The more black magic we have, the less participation there is in the software, and eventually in the policies.
-
[quote name=“zerodrama” post=“56689” timestamp=“1391373448”]
[quote author=wrapper link=topic=7305.msg56620#msg56620 date=1391349307]
If someone supplies a patch for Kimoto’s Gravity Well, we can set up a test.
[/quote]The more black magic we have, the less participation there is in the software, and eventually in the policies.
[/quote]I’ve now looked into the Gravity Well quite a bit. It very much fulfils our requirement to re-target based on the rate of change between the short and long block-time averages. I actually think it is brilliant.
A number of other coins are employing it specifically against multipool leeching, and they don’t seem to have forking or other validation issues, which would have concerned us more (with a larger network and more miners to update).
The only point against it is that it effectively retargets at each block, so it might be worth trying that (the easy solution) first.
It effectively damps the block difficulty changes, so our current granulated difficulty change may become superfluous.
[code]
unsigned int static KimotoGravityWell(const CBlockIndex* pindexLast, const CBlockHeader *pblock, uint64 TargetBlocksSpacingSeconds, uint64 PastBlocksMin, uint64 PastBlocksMax) {
    /* current difficulty formula, megacoin - kimoto gravity well */
    const CBlockIndex  *BlockLastSolved = pindexLast;
    const CBlockIndex  *BlockReading    = pindexLast;
    const CBlockHeader *BlockCreating   = pblock;
    BlockCreating = BlockCreating;
    uint64  PastBlocksMass            = 0;
    int64   PastRateActualSeconds     = 0;
    int64   PastRateTargetSeconds     = 0;
    double  PastRateAdjustmentRatio   = double(1);
    CBigNum PastDifficultyAverage;
    CBigNum PastDifficultyAveragePrev;
    double  EventHorizonDeviation;
    double  EventHorizonDeviationFast;
    double  EventHorizonDeviationSlow;

    if (BlockLastSolved == NULL || BlockLastSolved->nHeight == 0 || (uint64)BlockLastSolved->nHeight < PastBlocksMin) { return bnProofOfWorkLimit.GetCompact(); }

    for (unsigned int i = 1; BlockReading && BlockReading->nHeight > 0; i++) {
        if (PastBlocksMax > 0 && i > PastBlocksMax) { break; }
        PastBlocksMass++;

        if (i == 1) { PastDifficultyAverage.SetCompact(BlockReading->nBits); }
        else        { PastDifficultyAverage = ((CBigNum().SetCompact(BlockReading->nBits) - PastDifficultyAveragePrev) / i) + PastDifficultyAveragePrev; }
        PastDifficultyAveragePrev = PastDifficultyAverage;

        PastRateActualSeconds   = BlockLastSolved->GetBlockTime() - BlockReading->GetBlockTime();
        PastRateTargetSeconds   = TargetBlocksSpacingSeconds * PastBlocksMass;
        PastRateAdjustmentRatio = double(1);
        if (PastRateActualSeconds < 0) { PastRateActualSeconds = 0; }
        if (PastRateActualSeconds != 0 && PastRateTargetSeconds != 0) {
            PastRateAdjustmentRatio = double(PastRateTargetSeconds) / double(PastRateActualSeconds);
        }
        EventHorizonDeviation     = 1 + (0.7084 * pow((double(PastBlocksMass)/double(144)), -1.228));
        EventHorizonDeviationFast = EventHorizonDeviation;
        EventHorizonDeviationSlow = 1 / EventHorizonDeviation;

        if (PastBlocksMass >= PastBlocksMin) {
            if ((PastRateAdjustmentRatio <= EventHorizonDeviationSlow) || (PastRateAdjustmentRatio >= EventHorizonDeviationFast)) { assert(BlockReading); break; }
        }
        if (BlockReading->pprev == NULL) { assert(BlockReading); break; }
        BlockReading = BlockReading->pprev;
    }

    CBigNum bnNew(PastDifficultyAverage);
    if (PastRateActualSeconds != 0 && PastRateTargetSeconds != 0) {
        bnNew *= PastRateActualSeconds;
        bnNew /= PastRateTargetSeconds;
    }
    if (bnNew > bnProofOfWorkLimit) { bnNew = bnProofOfWorkLimit; }

    return bnNew.GetCompact();
}
[/code] -
[quote name=“FrankoIsFreedom” post=“57385” timestamp=“1391653264”]
It’s pretty easy to explain. There is a block target, an upper limit and a lower limit. The more you deviate from the target towards the limits, the greater the change in difficulty. As you can see from http://www.coinwarz.com/difficulty-charts/franko-difficulty-chart
Franko was suffering really badly from the multipools; we had a handful of dedicated miners and a ton of other miners who wanted to mine but couldn’t afford to. Since the update, all those little spikes on the difficulty chart are multipools being forced to mine at a fairer rate according to their hashing power. Even with spikes to 4+ difficulty, we have maintained our block target average of 30 seconds. An amazing change for us; there were times when FRK blocks were taking over an hour. I’m personally glad those days are over, and it seems other FRK supporters are too.
[/quote]Well, your previous re-target algorithm was not good (a 4.0 difficulty limiter like Litecoin’s). After looking at the graphs below, I must say Kimoto’s Gravity Well isn’t as good as it may seem. Your difficulty median over the last week is 1.8, but there are many spikes, a couple even over 4.0. PXC and FTC share the same basic re-target algorithm with some differences in parameters, and neither deviates from its median difficulty by more than 20%. There is nothing to worry about.
[img]http://phoenixcoin.org/archive/frk_pxc_ftc_diff.png[/img]
-
Oh guys, firstly, this thing is overcomplicated.
Why use this sophisticated approach when we already have a very responsive and smooth diff adjustment algorithm designed by PPC’s developers?
The second point is more technical.
NEVER EVER use floating point math in decentralized software.
That’s because of the floating-point accuracy problem.
It means that different hardware may perform the same calculations with (slightly) different results.
This will permanently destroy the network consensus in a way similar to Bitcoin’s March 2013 fork due to BDB limitations.
The larger the network grows, the more likely this is to happen.
Please consider this carefully.
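For reference, the PPC-style per-block retarget I mean looks roughly like this. I’m writing it from memory against the same Bitcoin-derived types used in the KGW snippet earlier in the thread, so treat it as a sketch of the shape rather than the exact code; note that it needs no floating point at all:
[code]
// Sketch (from memory) of a PPC-style per-block retarget: an exponential move
// of the target toward the desired spacing, using big-integer arithmetic only,
// so every node computes bit-identical results. Constants are illustrative.
static unsigned int RetargetPPCStyle(const CBlockIndex* pindexLast)
{
    const int64 nTargetSpacing  = 150;   // 2.5 minutes, illustrative
    const int64 nTargetTimespan = 86400; // one-day smoothing window, illustrative
    const int64 nInterval       = nTargetTimespan / nTargetSpacing;

    int64 nActualSpacing = pindexLast->GetBlockTime() - pindexLast->pprev->GetBlockTime();
    if (nActualSpacing < 0) { nActualSpacing = nTargetSpacing; } // guard against out-of-order timestamps

    CBigNum bnNew;
    bnNew.SetCompact(pindexLast->nBits);
    bnNew *= (nInterval - 1) * nTargetSpacing + 2 * nActualSpacing;
    bnNew /= (nInterval + 1) * nTargetSpacing;

    if (bnNew > bnProofOfWorkLimit) { bnNew = bnProofOfWorkLimit; }
    return bnNew.GetCompact();
}
[/code]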
-
[quote name=“RoadTrain” post=“58819” timestamp=“1392253809”]
The second point is more technical.
NEVER EVER use floating point math in decentralized software.
That’s because of the floating-point accuracy problem.
It means that different hardware may perform the same calculations with (slightly) different results.
This will permanently destroy the network consensus in a way similar to Bitcoin’s March 2013 fork due to BDB limitations.
The larger the network grows the more likely this will happen.
[/quote]He’s 110% correct.
Floating-point arithmetic is exact when integers having fewer bits than the mantissa are involved, but that’s about the only guarantee you can realistically make. You can’t, for instance, say that you’ll get the same result as using fixed-precision math: if you divide one floating-point value by another, and then add the result multiple times, you will likely obtain different results than if you had used fixed-point arithmetic (unless it was an exact division).
With all the new devices coming online, most won’t even have an FPU, and expecting them to follow IEEE 754 is just going to disappoint you. But even on the same architecture, your choice of compiler flags can result in different operations, like with -fast on the Sun Studio compiler, or -ffast-math in GCC. Did you know that many Gentoo Linux users use -ffast-math by default? #truestory
Don’t believe me? Try this code out:
[code]
#include <cstdio>
int main() {
    float a = 1.f / 81;
    float b = 0;
    for (int i = 0; i < 729; ++i)
        b += a;
    printf("%.7g\n", b); // prints 9.000023
    return 0;
}
[/code]…and yet…
[code]
#include <cstdio>
int main() {
    double a = 1.0 / 81;
    double b = 0;
    for (int i = 0; i < 729; ++i)
        b += a;
    printf("%.15g\n", b); // prints 8.99999999999996
    return 0;
}
[/code]Which one is correct? The answer is they both are, in accordance with the standard.
Fortunately, in that particular bit of code the result is never compared with == 0.0, because if you had done that, I can tell you it generally won’t compare equal even when you think it should.
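For contrast, here is the same sum done with plain integer (fixed-point) arithmetic. The result is truncated, but it is truncated identically on every conforming compiler and CPU, which is the property consensus code actually needs:
[code]
#include <cstdint>
#include <cstdio>

int main() {
    const int64_t kScale = 1000000000;  // fixed-point scale: 9 decimal digits
    const int64_t a = kScale / 81;      // 1/81, deterministically truncated
    int64_t b = 0;
    for (int i = 0; i < 729; ++i)
        b += a;
    // prints 8.999999991 on every machine, regardless of FPU, flags or compiler
    printf("%lld.%09lld\n", (long long)(b / kScale), (long long)(b % kScale));
    return 0;
}
[/code]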
+1 rep for you.