Poster for Fabrice Canel and Jason Barnard

The Low-down on IndexNow From Mr Bingbot

Fabrice Canel is officially Principal Program Manager, Crawling and Indexing, at Bing. He is, to all intents and purposes, Mr BingBot. Fabrice will walk us through:

– what IndexNow is (and which search engines use it)

– what it means to us as webmasters, marketers and SEO professionals

– how we can help search engines help us using this new URL submission API

And (since this will be a conversation and not a press release), he’ll share a few tidbits that will be as much of a surprise to us as to you 🙂

Scheduled for 2 November 2021 at 18:00 CEST (Paris)

The event is 100% free:

Book your place (free) >>

Save to your calendar >>

See on Google My Business >>

Set a reminder on YouTube >>

Organised by Kalicube in partnership with Wordlift.

Part of the Kalicube Tuesdays series.

Listen to the podcast episode >>

Transcript:

Jason: Hi everybody and welcome to the second part of this double bill of Kalicube Tuesdays with Fabrice Canel. You get the song. You’ve had it once before a couple of years ago in the offices of Microsoft. A quick hello and we’re good to go. Welcome to the show! Fabrice Canel!

Fabrice: Thank you. I will not give you a song. I will just speak. Thank you.

Jason: Right. And today we’re going to be talking about IndexNow. I’ve called you Mr. Bingbot. I know that’s terribly rude of me. You’re actually… What’s your official title? You are team lead for Bingbot at Microsoft?

Fabrice: I am a principal program manager at Microsoft.

I’ve been working at Microsoft for 24 years. A very long time. A very interesting time.

Jason: That was my question. I was looking you up on LinkedIn and you started at Microsoft in the late nineties. Did you start Bingbot?

Fabrice: Yes. In fact, we started Bingbot. I created the Wikipedia page on Bingbot.

So, yes. This was the evolution of MSN Bot: we started with MSN Search, then Live Search, then Windows Live Search, then Bing, and now Microsoft Bing.

Jason: You’ve obviously been on the team for the whole time. Have you been program lead from right at the beginning?

Fabrice: At first I was not the lead, but now I am in charge of managing part of this team to deliver the best index on the internet. Yes.

Jason: You must be the only person then who has actually been developing a crawling bot for 24 years or more, since the beginning of the “crawling period” of the internet?

Fabrice: Well, in the industry. Yes. I have seen all the evolution.

Jason: When you started, there were like 20 pages and it was really easy. And now it’s infinity.

Fabrice: Yeah. But now, this is infinity. The scale of the internet is really growing year by year by an unknown amount. This is something we don’t really know. Some people estimate that it is a 60% year-over-year increase. So the frontier keeps expanding because people now generate content using content management systems rather than Notepad, they can automate the creation of websites, and you see spammers and all of these people generating millions and billions of pages of useless content automatically every day.

Jason: And the fact that it’s growing at 60% a year, does that mean that you can’t keep up?

Fabrice: In the end, all search engines are managing a business.

We are business-driven, obviously, to satisfy the end customer, but we have to pick and choose. We cannot crawl everything on the internet; there is an infinite number of URLs out there. You have pages with calendars where you can go to the next day forever. So it’s really about detecting what is most useful to satisfy a Microsoft Bing customer.

Jason: Right. And a big chunk of this content is useless, like the calendar example: bad coding (lazy coding), spam, spun content. Would you say most content on the web is not useful, or is that exaggerating?

Fabrice: I think it’s a little bit exaggerated. We are guided by key pages that are important on the internet and we follow links to understand what’s next.

And if we really focus on these key domains (key pages), then this guides us to quality content. So our view of the internet is not to go deep forever and crawl useless content. It’s obviously to keep the index fresh and comprehensive, containing all of the most relevant content on the web.

Jason: Right. And then I think that’s the key. You prefer going wide and going deep.

So if I have a site that’s at the top of the pile, you will tend to focus more on me than on trying to find new things that you don’t already know about.

Fabrice: It depends. If you have a site that is specialized and covers an interesting topic that customers care about, then we may obviously go deep. It is not me selecting where we go deep and where we don’t, nor is it my team. It is the machine: machine learning that selects whether to go deep or deeper based on what we feel is important for a Bing customer.

Jason: That’s kind of something we lose sight of. I’m talking to Fabrice Canel and I get the idea that you’re programming this machine to come and look at my site, but you’re not. The machine is deciding for itself.

You’re just giving the machine the goals you want it to achieve.

Fabrice: Absolutely. Yes. The main input we give the machine learning algorithms is satisfying Bing customers. And so we look at various dimensions to satisfy Bing customers. Again, if you query for Facebook, you want the Facebook link in the top position. You don’t want some random blogs speaking about Facebook. That’s a good example.

Jason: Very good point. And the other thing you say there is customer, and I think that’s something we all miss. We talk about users, but in fact, as you said, you are a commercial company (same thing with Google), the search results are actually a product, and you’re looking at customers and considering what’s good for your customer.

And once you say customer instead of user, I think it changes our perspective of what we’re asking you to do. Which brings us to the main topic for today, IndexNow, which is all about how you can solve the crawling problem and master the internet. But before we do that, I really wanted to quickly look at your Brand SERP. This is a tradition on the show.

And for you, Fabrice, I looked at your Brand SERP on Bing, out of politeness to you. So if we can show that, Anton. There you go. That’s your brand SERP, your personal brand SERP on Bing, with the photo we know and love at the top and that teddy bear. Is there a story behind the teddy bear? Is that embarrassing?

Fabrice: This is a family story. When I met my wife, I bought this bear in the US and brought it back. So it is a big teddy bear that we have in the house and yes, this is family-related.

Jason: Brilliant. And it’s a real teddy bear. That makes it all the more fun and exciting. And then I had to look on Google for… I didn’t have to, but I chose to look on Google for the next slide, where I looked up IndexNow. And the reason I looked on Google was because I was curious about the fact that, for the moment, it’s Yandex and Bing. Google haven’t adopted this yet.

And I think that’s causing a lot of consternation in the industry. So we will have a quick word about that later on, but as you can see, the third result is this event, which I’m terribly pleased about. So Kalicube Tuesdays gets third ranking on videos on Google for IndexNow. I’m pleased with our SEO work. Now on to IndexNow: can we start with what it is, how it works and why it’s helpful?

Fabrice: Great question. So let’s start with the first one. We have a single fundamental problem, which is that my platform, and other search engines, don’t really know when Jason will post the next thing on the internet: the next post, the next video, etc.

Jason, you know, but I don’t know. So the model of crawling is really to learn, to try to figure out when things are changing. When will Jason post again? We may be able to model it. We may be able to try to figure it out, but we really don’t know. So what we are doing is polling and polling and crawling and crawling to see if something has changed.

This is the model of crawling today. We may learn from links, but at the end of the day, we go to the home page and figure it out. So this model needs to change. We need to get input from the website owner, Jason, and Jason can tell us via a simple API that the website content has changed, helping us to discover this change – to be informed of a change, to send the crawler and to get the latest content.

That’s an overall industry shift: from crawling and crawling and crawling to discover if something has changed, to the website easily notifying us when the content has changed.

Jason: Which takes us back 24 years to when all of this started, when we used to submit our web pages to the search engines.

Are we coming back to that?

Fabrice: I think in the early days, yes, you needed to go to a form and submit your URL, but it was the right model. So the right model is not to have Jason go to each and every search engine to tell each and every search engine “I changed something.” What we decided to do is come up with an API layer, API interfaces, to help you – you as a human, you as a content management system, or a CDN (content delivery network) company – to tell us what has changed on your website. Ideally, it should not be the webmaster who tells us; rather, it should be an underlying layer, such as the content management system, that tells us, the search engine, that something has changed. So we think an API layer is better than a web form, because people will forget and people don’t care about all search engines. This is an API layer that is open to all search engines, where websites can notify us of each and every change.
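To make the idea concrete, here is a minimal sketch of the kind of ping a CMS or script could send, assuming the publicly documented IndexNow GET endpoint on Bing. The key, endpoint and URL below are placeholders; the protocol also expects the key to be hosted in a text file on your site so the search engine can verify you control the domain.

```python
import requests

# Placeholder values for illustration: generate your own key and host it in a
# text file on your site (e.g. https://www.example.com/<key>.txt) so the
# search engine can verify that you control the domain.
INDEXNOW_KEY = "a1b2c3d4e5f64a7b8c9d0e1f2a3b4c5d"
ENDPOINT = "https://www.bing.com/indexnow"  # any participating engine's endpoint


def notify_indexnow(url: str) -> int:
    """Tell IndexNow-enabled search engines that a URL was added, updated or deleted."""
    response = requests.get(
        ENDPOINT,
        params={"url": url, "key": INDEXNOW_KEY},
        timeout=10,
    )
    return response.status_code  # 200/202 means the notification was accepted


# Example: call this from a publish/update/delete hook in your CMS.
notify_indexnow("https://www.example.com/new-blog-post")
```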

Jason: As you say, we forget, and if it’s integrated in the CMS, it just gets pinged out there.

So I just need to create my page or update my page and it immediately gets pinged out by the CMS (WordPress, Duda, or Wix, for that matter), hopefully, if they can integrate it. And that’s one of the big challenges you’re having, as far as I understand it: not all the CMSs are as enthusiastic as you are.

Fabrice: We have started to engage with all of the CMS companies and CDN companies to explain what IndexNow is all about and to have them adopt IndexNow. So in the following weeks and months, we will make a series of announcements saying that not only are more search engines adopting IndexNow, but also content management systems, CDNs and other “dimensions” that I cannot really speak about yet will all be adopting IndexNow.

Jason: Before we come to the question of other search engines: CDNs. That’s something I hadn’t really considered. Basically you can sidestep the CMSs by getting the CDNs involved, because an awful lot of content now goes through CDNs, and if they’re submitting to you because they know it changed, then the game is won, isn’t it?

Fabrice: Yes. So, one of the main CDNs, CloudFlare, has already adopted IndexNow. This is providing useful information about what changed on each and every CloudFlare site. This is an opt-in model. You have to click a button in the CloudFlare console to notify search engines (Yandex and Bing for now, but more are coming) about changes on your sites.

Jason: Because I use CloudFlare, and as you started to say that, I thought, “Brilliant! Fabrice already knows about all my new content”, but in fact, I haven’t activated this. Why did they make it opt-in and not just push it on everybody?

Fabrice: So this is what we are reviewing with content management systems and CDNs: what should the right model be? Is it an opt-in model or is it on by default? As we move forward, I do believe that most of the internet will move to an opt-out model where basically it’s built in by default and people can opt out. Due to regulations worldwide, people, for now, prefer an opt-in model, but I think at some point they will switch to an opt-out model as people become more knowledgeable and more comfortable with IndexNow.

Jason: Because there are two problems, one of which is how lazy we are. And yet you now tell me I have to go to CloudFlare and click on six buttons. That’s already tiring me out. One for each site, I would have thought. No? Is it one for the whole account or one per site?

Fabrice: It is one per site.

Jason: Right. Six buttons to click on and I’m lazy. And the second problem is that I didn’t know until you just told me. How can we get that message out other than doing shows like this?

Fabrice: So this was part of the blog post announcement, and CloudFlare has communicated too. We are getting visibility because we have lots of URLs coming in on this new system and a lot of sites have already adopted it.

So again, we are benefiting from Jason’s show and other communications to get the message out and to tell people: if you have CloudFlare sites, click here to enable this feature.

Jason: We’ve got about 50 people watching. So if those 50 people each control let’s say, six sites, that’s 300. I would imagine that CloudFlare has got a few more than that. You said a lot, how much is a lot?

Fabrice: So, again, there are some sites on the internet that track the usage of CDNs across websites; W3Techs is one such site, and they estimate about 20% of the internet is on CloudFlare today. So this is a huge volume of URLs coming in. I cannot disclose the number of URLs coming in, but yes, we are very pleased with what CloudFlare has done as an early adopter of IndexNow, and clearly webmasters adopting IndexNow via CloudFlare are happy customers. But there is more than just CloudFlare for CDNs. Akamai has also disclosed that they are looking at adopting IndexNow, and there are other CDNs coming that we’ll announce later on.

Jason: Right. So it looks to me, Fabrice, like this is you and your idea of IndexNow, because if I understand correctly, a lot of this came from you personally.

You’re solving the problem of crawling. Is that what you’re looking at?

Fabrice: Yeah. At the end of the day, this is really solving your problem, Jason, and the problem of each and every website. Crawling is a dumb thing to do. BUT, if you tell me what has changed, my crawler will be super smart and super intelligent, and it will come to pick up the latest content instead of, again, polling and polling to try to get the latest content from you.

You know when the content has changed, but I don’t know. Please tell me when the content has changed. Why wait for the search engine crawler to come in the next hour or next day before picking up your latest content? Tell search engines when the content has changed. It’s as easy as that.

Jason: And what strikes me is you’re not actually asking me, or I’m not even telling you. In fact, you’re getting the CMS and the CDNs to do the work for me.

So I’m going to be informing you without even knowing it, which is really rather sneaky and smart. Brilliant.

Now the big question is Google hasn’t adopted it yet. What’s happening on that front?

Fabrice: I think you should ask them. You should do your journalist work. This is an open protocol, so the story is that when you ping one search engine, we will be sharing it behind the scenes, so all search engines will be aware of this notification. And the overall benefit is being indexed in ALL search engines, being fresher, having the latest content indexed in all search engines. But the bigger point is really about minimizing the crawl load on each and every site.

Instead of having a lot of crawls done by each and every search engine, if the site tells us that the content has changed, then we can be smart about going and picking up, selecting, and crawling the latest content… just once ideally.

Jason: What I hear from that (that Google will adopt) is my interpretation. I’m not saying that’s what you said!

But Google would be foolish not to because, A: it saves them money on crawl costs, and B: if they don’t, they’re going to have a less fresh index than you.

Fabrice: So today, IndexNow benefits us a lot. As you said, there are various dimensions that we are tackling. One is top sites: eBay, for instance, has adopted it, LinkedIn has adopted it, GitHub has adopted it… lots of major sites have adopted the IndexNow system.

Knowing what changed on eBay? Good luck with that. We don’t know. Yes, we can figure it out from links, but if you post something on eBay, the best way for us to know is this kind of URL notification that something has changed. In the end, this helps us to have the latest content, fewer dead links, and a more useful index at Bing for eBay. Obviously this is not only satisfying Bing customers, it is also satisfying eBay and its customers: eBay customers get more good-quality traffic coming from Bing and from Yandex.

So point number one is the search engines, then content management systems, CDNs (and large sites).

These are the bases we are covering but in the end we want to cover the whole internet. Actually, maybe not the whole internet: maybe 20% of the internet will never adopt IndexNow, but that is fine because on 80% of the internet, we will be fresh and minimize crawl load on all of these websites.

Jason: We’ve got the 80/20 rule here.

Before this conversation, I would have said “80%, that’s insane, that’s not possible”, but now you’ve listed it out, I’m beginning to believe you. This is possible, isn’t it?

Fabrice: Yeah. This is possible and we do believe this is possible. Not within the next month obviously, but it will take one or two years to get there.

With major content management systems, major CDNs such as CloudFlare and Akamai, plus a few others that are coming… plus top sites such as eBay and many other very large websites, you cover a REALLY big part of the internet.

Jason: And the question about WordPress: do you think this could get integrated into the core of WordPress, or are you going to be relying on plugins like Yoast?

Fabrice: So we do believe that there is interest, and there is a public thread open with WordPress where I disclosed this feature request relative to WordPress. We have offered some help in terms of coding because it’s open source.

We have offered some example code that may be able to support IndexNow, and there is a thread open on this topic. Again, the value proposition for all WordPress users is that there are a lot of WordPress sites out there: 40% of the internet is on WordPress.

Jason: Anton was sharing earlier that it’s 42%, I think, that is on WordPress, which is insane. And I thought it was about 30%, but that’s mad.

Fabrice: Yeah, it depends on how you count, but whatever, this is big, and this is challenging for us. A lot of blog posts are almost never updated, or they are updated once or twice per month or even per year. We don’t really know, and we have to crawl a lot. And we cannot crawl each and every WordPress site daily or hourly.

So think of the big benefit again: if you post something, you care about having this content indexed quickly in search engines. One benefit is less crawl load, but often these are small sites, so crawl load is not really a big deal… the big deal is really about getting the content indexed quickly. You don’t have to wait one day or even five days to get the content indexed in search engines.

Jason: My frustration is always when I update something and I can’t be bothered to go in and submit it by hand to Bing and to Google. And then I think, four or five days later, why haven’t you found it? And the answer is…

Fabrice: No IndexNow! Because I can monitor Jason’s page every five seconds with my crawler, but you won’t like me.

We should be smarter than this. We should know when you fix a typo, when you fix something on your site, when you add a new page. It’s easy if you tell us about that change, and then the search engine will come to get the latest content (hopefully only once)… and we are done, you are happy, my customers are happy. Everybody’s happy.

Jason: And you’re saving me effort and bandwidth and server load, but for a lot of small sites that simply doesn’t matter.

And a lot of small sites are saying: when I hardly ever change my content, or I’ve got very few pages, why would I care?

Fabrice: Freshness means getting the latest content, or the latest fix that you make on your site, quickly reflected in search engines. I think this is value proposition number one.

Second is that as more search engines adopt IndexNow, the sum of all search engines worldwide starts to make a lot of impact on your website. So with IndexNow, we should become smarter, crawl less often, and the total volume of crawling should diminish.

Jason: Basically, you’ll also (now I hadn’t really thought about this) push all of this into your machine learning, so the crawler can then learn by itself the patterns of how things change and therefore reduce its own crawling footprint! So less actual crawling goes on behind the scenes to get the same result.

And that leads me to the next question, which is, does the fact of using IndexNow then reduce the crawl of my website? Do I then penalize myself and lose crawl?

Fabrice: As sites are adopting IndexNow, the machine learning that we have behind Bing is learning about, “Oh! this site has IndexNow enabled, great”.

So that’s easy to detect, because we know that you are publishing through IndexNow. What is a little bit harder to understand is: “Is it a useful notification?” Meaning, are you notifying us only for part of the site or for the whole site? Because you may be missing some folders, you may have an application section that is generated from another content management system, or you may be making some mistakes where you do not notify us about a specific page, and so on and so on.

So this learning is ongoing and, as we go, the machine learning for crawling will become more intelligent and crawl less than we used to on sites that are adopting IndexNow.

Jason: And sticking to that point: for the moment, Yoast, who are obviously massive in the WordPress world…

And that’s why I was asking about the core, because they’re saying that for the moment they’re not adopting it: as far as they are concerned, the smaller sites don’t really need it, as I understand it, and because Google hasn’t adopted it yet, they don’t really see the point. Do you feel that’s underestimating a) the importance of IndexNow and b) the fact that Bing, even though it’s smaller than Google, is still 8 to 10% of organic website visits?

Fabrice: Yes. What people often forget about Bing is that Bing has a market share that is not zero, and there are also a lot of companies leveraging Bing. So if we sum all of this, yes, we are, as you said, in the 8% to 10% range for mobile plus desktop. If you look at desktop only, then this is even higher. So there is obviously an overall benefit from Bing. And we are also leveraging Bing today to power search across Microsoft.

And again, the value is maybe not in reducing crawl, because today only Yandex and Bing have adopted IndexNow (more are coming), but it’s also about freshness, getting the content quickly indexed. So if you care about satisfying your customer and getting the latest post quickly indexed in search engines, yeah, this is something to adopt.

So we will see Yoast, maybe at some point, adopting. We are speaking to them. We will potentially also see other plugins adopting IndexNow in WordPress. There are discussions ongoing. And at some point WordPress will adopt IndexNow.

I am totally convinced that the whole industry will adopt.

Jason: What I’m hearing is you’re really determined, and you’re really going for this, Fabrice.

You’re not going to let this lie until everybody takes this up because you’re so convinced that this is important, both for crawling and for the environment.

Fabrice: Absolutely. I am convinced that this is the right “fight” for the whole industry, for the internet. This is about notifying what changed on the internet, helping the latest content to be indexed, and helping to remove crawl load from websites. This is clearly a great value proposition.

We have heard these complaints for years: “I didn’t get my latest content indexed” or “you are crawling me too much”. This is the case for all search engines. This is also the case for Google. Often people welcome Google because they send customers, but they still crawl a lot. So if we can reduce the overall crawl volume on all sites, everybody will appreciate it, and this will help the whole industry.

My teenagers want to see the latest content indexed. They don’t care about content from five days ago. They want the latest content. This is all about letting the latest content be reflected in search engines. I think this is making a difference for the whole internet.

Jason: Talking about the latest content – a little off topic but Facebook renaming themselves to Meta is interesting. I actually did a study of their Brand SERP to try and figure out how they’d managed the rebranding. Turns out that they bought a company called Meta five years ago. So they’ve been planning this for years… but what struck me is how fast both Bing and Google updated the content to reflect it in Meta’s Brand SERP.

Within hours of Mark Zuckerberg mentioning it, the new content representing Meta as the new Facebook was up there. That freshness is really difficult to do at scale.

Fabrice: Yeah. When there is a lot of content talking about something, then search engines will pick up that “noise” and some of the content will quickly be in top position.

But what people forget is that there is a lot of valuable content that is not picked up by search engines. When such major things happen, we will get CNN and other sites and this content will surface… but think about a school, your school. Maybe they post the latest calendar events and maybe search engines will miss that for weeks, for months! But here you have the ability to really quickly tell search engines, “Hey, I updated the school page. There is a COVID restriction and that is changing the schedule.” And so you get the latest agenda quickly reflected in search engines.

Jason: You’ve just mentioned COVID. Is that something that pushed you forward with this, because COVID meant we needed to get up-to-date information about school closures, about company closures, about shops that weren’t open, about lockdowns? Or were you already planning this? You’ve been planning this for four or five years.

Fabrice: We had been planning this for a long time. The URL submission API that we have at Bing, which was proprietary to Bing, was a first instance of really learning at scale how to manage this kind of notification from top sites, and we benefited a lot. So the view was, “Okay, we have this API, let’s scale it to support the whole internet and offer it as an open protocol to all search engines and to the whole community, to help everybody get the latest content indexed at Bing.” It was a proprietary solution to learn from. And now we are opening it up and extending it to the whole internet.

Jason: I love the term “to learn”. And a lot of this is: we do something, we learn, we get the machine learning working… then we’ve got an idea of how it’s going to function, and then we can release it. One question that a lot of people are asking is: if I ping IndexNow, does that guarantee indexing?

Fabrice: That would be too easy for spammers to get their content indexed. So obviously this is just a hint that the content has changed, and we will take our own decision about crawling or not crawling the content. You may notify us every second that your homepage has changed, and this is maybe not true. You are not a human if you update your homepage every second 🙂 So we have some basic heuristics, and also advanced machine learning, to really understand if this is useful content or not, and take that into account. But it’s clearly a good hint for quality sites to get their latest content indexed.

But if you are eBay or another large site, then this helps with tail pages, content that is not so “popular”. In that case, we are more likely to take this link into account versus some other links that may pop up on the internet. Why? Because we know that eBay is not a human publishing these links; they are publishing these links from a content management system. So that means, in general, high-quality links coming in.

Jason: Which then leads me to the next question, which is what are the limits that will be placed on websites? Is there an absolute limit or is it all going to be machine driven to understand what a site would be expected to submit and when they are likely to publish things?

Fabrice: Yeah, obviously we don’t accept 1 billion URLs from Jason’s site every second. That’s not a very reasonable scale. But we are able to support 10 thousand URLs per POST, at some QPS. So if you POST at that rate, that already adds up to billions of URLs per day per site that we are able to support. So it’s a large scale. This is not about one URL (on WordPress sites it will be one or two URLs every few days). But on large sites the volume is far more than this – we are looking at millions of URLs every day on some sites.
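As a sketch of what a larger site might do, here is the batch form described in the IndexNow documentation: a JSON POST that can carry up to 10,000 URLs per request. The host, key and URLs below are placeholders for illustration.

```python
import requests

# Placeholder site details for illustration only.
payload = {
    "host": "www.example.com",
    "key": "a1b2c3d4e5f64a7b8c9d0e1f2a3b4c5d",
    "keyLocation": "https://www.example.com/a1b2c3d4e5f64a7b8c9d0e1f2a3b4c5d.txt",
    "urlList": [
        "https://www.example.com/listing-1",
        "https://www.example.com/listing-2",
        # ... up to 10,000 URLs in a single request
    ],
}

# POST to one participating engine; per the protocol, the notification is
# shared with the other participating search engines behind the scenes.
response = requests.post(
    "https://www.bing.com/indexnow",
    json=payload,
    headers={"Content-Type": "application/json; charset=utf-8"},
    timeout=10,
)
print(response.status_code)  # 200/202 means the batch was accepted
```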

Jason: One question that people have been asking is when will I get caught for abuse? And the answer is it’s going to depend on your site. Is it site level or source level?

So with a CDN like CloudFlare, for example, would you be applying an overall kind of abuse policy, or would it be site-based and you would look at each domain?

Fabrice: We look at each domain. We look at everything. We cannot tell you exactly what we are doing, because that would be an open door for spammers, but it still follows the Bing Webmaster Guidelines about what we do and what we do not do per site.

For example, what we consider allowed and not allowed to be indexed. So if your site is not indexed and you think you have a valid reason to be indexed, feel free to open a support ticket. We DO look at them to review why the site is not indexed, and if you need to be indexed and we are not doing a great job at that, then we will fix the problem.

Jason: Does this now mean that I can forget about my XML sitemap and just ping IndexNow?

Fabrice: Excellent question.

XML sitemaps are a great way to tell search engines on a daily basis what is listed on your site. It is a list of URLs that you refresh daily, in general. Why daily? Because search engines (well, at least Bing) will try to visit your sitemaps once a day to collect all these URLs.

So it’s a good way to tell search engines on a daily basis about all of the URLs on your site.

What about IndexNow? IndexNow is all about knowing “now”. You should not have to wait one day, or two days, for us to discover the URLs from the sitemaps. We should know now what has changed on the site. If you fix a link, if you delete a link, if you update a link, please tell us via IndexNow.

Unfortunately, the delete scenario is not well handled via sitemaps; it is hard for search engines to understand what has been deleted from sitemaps. IndexNow provides a clear signal about what changed on the site. We may miss a ping, or we may have a system issue that prevents us from receiving a ping one time. That’s fine. You still have a backup via sitemaps.

And this is why the two are both important: we get the latest content via IndexNow, and, once a day, we will verify and be aware of all the URLs on your site via your sitemap.
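To illustrate how the two mechanisms complement each other, here is a simplified sketch of a publish hook that pings IndexNow immediately and also regenerates the XML sitemap that crawlers revisit roughly once a day. All names, URLs and the key are hypothetical, and a real sitemap would track a per-URL lastmod rather than a single date.

```python
import requests
from datetime import datetime, timezone

INDEXNOW_KEY = "a1b2c3d4e5f64a7b8c9d0e1f2a3b4c5d"  # hypothetical key, also hosted on the site


def on_publish(changed_url: str, all_site_urls: list[str]) -> None:
    # 1. The "now" signal: ping IndexNow as soon as the page changes.
    requests.get(
        "https://www.bing.com/indexnow",
        params={"url": changed_url, "key": INDEXNOW_KEY},
        timeout=10,
    )

    # 2. The daily backup: rewrite the sitemap listing every URL on the site,
    #    which crawlers pick up on their regular (roughly daily) visit.
    today = datetime.now(timezone.utc).date().isoformat()
    entries = "\n".join(
        f"  <url><loc>{u}</loc><lastmod>{today}</lastmod></url>"
        for u in all_site_urls
    )
    sitemap = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(sitemap)
```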

Jason: Brilliant stuff. We’ve covered pretty much all of the topic, unless there’s something I’ve forgotten in which case, tell me now…

Fabrice: NOW – it’s all about “now”.

It’s a very easy API to use. Please look at the documentation.

Jason: I was looking at that. It’s so easy even I can understand it, and I’m rubbish at coding. So I can confirm that it is incredibly simple.

Fabrice: Yeah, it’s incredibly simple to use, because simple ensures that it will work for the next whatever number of years, 50 years. It’s a simple GET URL that you can type in your browser to notify search engines (all search engines adopting IndexNow, at least) of changes on the internet. So it is clearly something that will be around for a very long time, just like sitemaps and other protocols we established between search engines. And this, I think, will help the whole industry to get the latest content indexed and remove crawl load at scale on the internet. So we do this for the good of the internet.

We do this also to help people, to help us too obviously, but to help the whole internet get their content reflected in search engines. And I think it’s a good task to do for the good of the internet. Yes.
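For reference, the “GET URL you can type in your browser” that Fabrice describes follows this general shape per the IndexNow documentation; the key and page address here are placeholders, and the same key must be published in a text file on the site so the engine can verify ownership.

```
https://www.bing.com/indexnow?url=https://www.example.com/updated-page&key=a1b2c3d4e5f64a7b8c9d0e1f2a3b4c5d
```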

Jason: Wonderful stuff. And then the last question is, you’ve just said IndexNow is incredibly simple.

It’s going to be around for the next X years, maybe 50 years. Does that mean that you think you’ve already nailed it? Will the functionality evolve over the next few years or is it pretty much fixed like this with a ping to create, ping to update and ping to delete?

Fabrice: Yeah, it’s a simple ping. There is no notification about update or delete or anything that you need to add. It’s just a simple signal that a URL has changed, and search engines will figure it out. Maybe a search engine did not have the URL in the first place, so it doesn’t know that it’s an update; for them it will be a new URL. Or maybe the page is gone, and so we will delete it and say, “This is a dead link and that’s fine – I don’t need to index it”. So again, we do believe in this API. We are listening for feedback, but we think it’s a well-established protocol.

So far, Yandex and Bing have adopted it. The IndexNow protocol has been reviewed by many other search engines, and we collected feedback from them before announcing it. So, between the learning we got from the Bing Webmaster Tools API, this broad discussion within the industry about the IndexNow protocol, and the discussions between search engines… we think that this protocol will last and will not change for a very long time.

Jason: So the conclusion I just heard is basically that, whether we (the community) like it or not, this is coming and it’s going to be here. If you’ve got anything to do with it, at least.

It is helpful, it is useful, it does help keep the content fresh and however much we might want to resist it, it’s going to happen anyway.

Fabrice: Yes, it will. Again, with CloudFlare (20% of the internet), plus content management systems.

You will see announcements over the following weeks and months that extend the scope of IndexNow.

And maybe a “large search engine” will adopt it someday, and we will all contribute to the whole internet in terms of quality, helping everybody to be indexed in all search engines. Yes.

Jason: Brilliant. Well, I’m going to stick my neck out and say, “I bet my bottom Euro, that Google will adopt in the next few months”.

And then the whole thing will explode and we’ll all be terribly happy. And now, thank you very much, Fabrice. That was absolutely brilliant. That last comment was my own personal point of view, nothing to do with anything Fabrice said. We’re going to announce next week’s show. We’re doing two baton changes this week: we just had Mads giving the baton to Luke, and now I’m going to announce, once again, Luke, striking the right balance between SEO and CRO. Once again, I met him in Brighton. He’s delightful, cheerful, intelligent, informative, and enthusiastic. What more can you ask for? So Fabrice, could you pass the baton to Luke as well?

Fabrice: Yeah. I will ask Luke if he plans to use IndexNow on e-commerce websites. Clearly, eBay and many others are adopting IndexNow. We will potentially see an announcement of a content management system related to e-commerce sites adopting IndexNow. So we do believe that for e-commerce, the way to get traffic on e-commerce sites is to get your latest content indexed.

If you have a shopping product, if you sell something and do not get it indexed, that isn’t good. If your price changes… if you think about Cyber Monday coming up in the US and a few other markets… then getting the latest content indexed everywhere matters. So please use IndexNow for e-commerce sites.

Jason: Absolutely brilliant. What I love about that is I asked you what big point I’d missed. And it was that one and you were saving it up for passing the baton. Absolute genius, Fabrice. Thank you very much. A quick goodbye to end the show. Thank you, Fabrice and everyone who watched. 

Fabrice: Thank you Jason.
