How to Create a Successful SEO Content Strategy
The enormous number of people with "SEO" as their job title really should put the lie to the idea that metadata doesn't matter and that Google is a fair system.
The metadata might be provided by some neutral third party, as a matter of public record, or simply as the accumulated weight of numerous uncorrelated data points. I think it's a great argument against being too ambitious about the possibilities of metadata (which the Semantic Web people were), but it does conclude with the thought that on-page metadata is a fundamentally good idea. Self-driving cars tell a similar story: there are woolly intimations that they will read road signs to work out the speed limit for any stretch of road, but the truth seems to be that they use the current GPS co-ordinates to look up manually entered speed-limit data.
Peter Thiel writes fairly convincingly in a chapter of his book about how humans will work together with machine learning for a long time; it's just a shame he's talking about surveillance of the public by their local council. Once they're flagged, a human can always read their email anyway. Thiel is also the source of the "We wanted flying cars, but instead we got 140 characters" quote, which has long since been memed into oblivion by its send-up: "We wanted flying cars but all we got were pocket-sized black squares holding all of human knowledge". In the meantime, manually attached metadata trumps machine learning in many fields once they mature, especially in fields where progress is faster than it is in internet search engines. But "machine readable" strictly dominates machine learning: why bother with the maths and machine vision when you can just write it down in an XML file? Then again, with all these reCAPTCHAs I'm filling out, the machines must be getting really good at recognising palm trees. It sounds too good to be true, right?
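To make the "just write it down in an XML file" point concrete, here is a minimal sketch of what machine-readable speed-limit data could look like and how a satnav might consume it. The schema, element names, and coordinates are illustrative assumptions, not any real mapping format:

```python
import math
import xml.etree.ElementTree as ET

# Hypothetical machine-readable speed-limit data. The schema and values are
# made up for illustration; they are not taken from any real mapping standard.
SPEED_LIMIT_XML = """
<roads>
  <segment id="A404-1"  lat="51.5720" lon="-0.7120" limit_mph="60"/>
  <segment id="A404-2"  lat="51.5890" lon="-0.7240" limit_mph="50"/>
  <segment id="HIGH-ST" lat="51.5730" lon="-0.7080" limit_mph="20"/>
</roads>
"""

def speed_limit_at(lat: float, lon: float) -> int:
    """Return the limit of the nearest tagged segment: a lookup, not machine vision."""
    root = ET.fromstring(SPEED_LIMIT_XML)
    # Plain Euclidean distance on lat/lon is good enough for a toy example.
    nearest = min(
        root.iter("segment"),
        key=lambda seg: math.hypot(float(seg.get("lat")) - lat,
                                   float(seg.get("lon")) - lon),
    )
    return int(nearest.get("limit_mph"))

print(speed_limit_at(51.5729, -0.7082))  # nearest segment is HIGH-ST, so 20
```

A satnav that ships with (or periodically downloads) a file like this needs no camera and no model, just a lookup against data somebody wrote down.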
You can live in the future right now, if you use the right mobile app as your satnav. Search engines are far less forthcoming: even webmasters are only given access to a very small portion of the data about their own sites, just enough to let them debug issues. That secrecy also misleads the public into thinking that metadata is somehow ancillary and that search engines will work it all out on their own, which discourages webmasters from bothering with the basic things that will help people discover their pages, like OpenGraph tags or Twitter cards (sketched below). And worse yet for the data scientists, as soon as they establish the viability of doing something new with a computer, people will rush to apply metadata to make the process more reliable and explainable.
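OpenGraph tags and Twitter cards are exactly the kind of cheap, hand-authored metadata being defended here. As a rough illustration (the example page and the tiny parser below are mine, not taken from any particular SEO toolkit), this is roughly all it takes to read such tags back out of a page:

```python
from html.parser import HTMLParser

class MetaTagParser(HTMLParser):
    """Collect OpenGraph (og:*) and Twitter card (twitter:*) tags from a page."""

    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        key = attrs.get("property") or attrs.get("name") or ""
        if key.startswith(("og:", "twitter:")):
            self.tags[key] = attrs.get("content", "")

# A small stand-in for any page a webmaster controls.
EXAMPLE_HTML = """
<html><head>
  <meta property="og:title" content="Why on-page metadata still matters"/>
  <meta property="og:image" content="https://example.com/cover.png"/>
  <meta name="twitter:card" content="summary_large_image"/>
</head><body>...</body></html>
"""

parser = MetaTagParser()
parser.feed(EXAMPLE_HTML)
print(parser.tags)
# {'og:title': 'Why on-page metadata still matters',
#  'og:image': 'https://example.com/cover.png',
#  'twitter:card': 'summary_large_image'}
```

Anything a twenty-line parser can read back reliably is something a crawler never has to infer with machine learning.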
We know that follow links have been gamed "to death" in the past, so it would make sense to make that particular element a little less important in the overall mix. Still, you feel like an idiot suggesting something as diminutive as an XML tag when others make wild (and wildly confident) claims about what the burgeoning Strong AI will achieve. Just as they get it sorted and demonstrate its utility, McDonald's will probably just calculate and provide those routes as public information. The accumulated knowledge of human civilisation is still mostly in books.