[dns-operations] .PL DNSSEC broken again
muks at mukund.org
Mon Jun 17 21:41:37 UTC 2019
On Mon, Jun 17, 2019 at 01:59:41PM +0000, Paul Vixie wrote:
> On Monday, 17 June 2019 13:09:15 UTC bert hubert wrote:
> > On Mon, Jun 17, 2019 at 04:49:34AM -0400, Viktor Dukhovni wrote:
> > > > ...
> > The problem is that from an operator point of view, DNSSEC is optional. They
> > can just turn it off. This means they _do_ hold it to a higher standard
> > because if it causes problems, they can do without it.
> the purpose of dnssec was to make dns more fragile, by increasing both the
> overall number of things that must be working in order to operate it, and the
> share of things in dns which must be working in order to operate it. anyone
> who wants reliability and is willing to trade off authenticity to get
> reliability, should not be using dnssec. i hope there is nobody like that.
Perhaps most are like that. It has been 14 years since RFCs
4033/4034/4035 were published. DNSSEC is not a separate namespace - it
works compatibly within the DNS. So why is adoption low? Is it because of:
(a) Complexity of understanding/operating DNSSEC (this has reduced over
the years)
(b) Lack of knowledge/interest
(c) Lack of software implementation
(d) Risk of operational problems (considered vs. risk of poisoning)
In my slightly dated Alexa top 500 list of domains scanned today, 16
(3.2%) are signed. Many of the top websites are quick to adopt new
technologies. It is unlikely the DNS operators managing these zones have
not heard of DNSSEC, or are incapable of signing them. What is the
factor that stops them from signing their domains?
E.g., Google is quick to deploy extensions and new algorithms in the TLS
support of its web services and Chrome. It (along with other browser
vendors) pushed HTTPS usage to levels where a webpage is now more likely
to be served over HTTPS than plain HTTP. It is forward-thinking in
deploying other protocols and newer codecs on YouTube. Google Public DNS
(8.8.8.8) is a validating service. Why, then, has it not signed
google.com to lead by example?
Mozilla has pushed DoH (something very new) quickly into Firefox. As
an organization, it appears very security- and privacy-conscious, with
services like observatory.mozilla.org. Why, then, has it not signed
mozilla.com to lead by example?
I suspect many zone operators weigh the risk of the services on that
name becoming unavailable due to operational mistakes/implementation
bugs against the risk of poisoning attacks. Maybe, fast-forward a few
years, if poisoning grows to affect a significant share of their
users/traffic, they'll move.
Possibly it needs something like what the browsers/search engines did
to move websites to HTTPS: penalizing those that don't adopt it.
It'd be interesting to watch how quickly DNS transport security for
authoritative servers is adopted by operators. The effects of operational
mistakes may be different there, and it may see a higher rate of adoption.