The short version is that Google first generates a shortlist of results for a given query, say around 1,000 pages, selected on basic relevance between the query and the content of each page. Google then applies the rest of its ranking signals and criteria to that shortened list. That, according to Gary Illyes, is where “the magic” happens.
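This two-stage pipeline, a cheap retrieval pass that trims billions of candidates down to roughly a thousand, followed by a more expensive re-ranking pass, can be sketched in a few lines. Everything below (the index, the term-overlap scoring, the cutoff) is an invented illustration, not Google's actual implementation:

```python
# Toy stage-1 retrieval: keep only pages matching the query terms,
# order them by a cheap relevance score, and cut to a fixed limit.
# The scoring (term overlap) and the index are purely illustrative.

def retrieve(query, index, limit=1000):
    terms = set(query.lower().split())
    # Score each page by how many query terms its text contains.
    matches = [
        (page, len(terms & set(text.lower().split())))
        for page, text in index.items()
    ]
    matches = [(p, s) for p, s in matches if s > 0]
    matches.sort(key=lambda ps: ps[1], reverse=True)
    return [p for p, _ in matches[:limit]]

index = {
    "a.example/cookies": "lemon coconut cookies recipe",
    "b.example/bikes":   "mountain bike reviews",
    "c.example/baking":  "coconut cookies and other baking tips",
}
print(retrieve("lemon coconut cookies", index))
```

Stage 2, the “magic”, would then re-score only the surviving shortlist, which is what keeps the expensive signals affordable.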
Google released a new podcast episode in which John Mueller, Gary Illyes, Martin Splitt, and Duy Nguyen, a member of the Google search quality team, discussed how the company fights search result spam and ranks search results.
The new episode can be listened to on YouTube. For your convenience, the full transcript is included at the end of this post.
How Can You Join Google SEO Office Hours?
These sessions let you ask Google Search staff questions about your website and Google Search in real time.
What Are These Office Hours Sessions?
Google Search Central holds regular office hours on its YouTube channel, where anybody can ask Google specialists questions via video call. This is your chance to put questions about Google Search and your website directly to Google employees.
These office hours are open to the public and recorded, so everyone has access to the information shared. You can submit questions ahead of time, or ask them live if you join the session.
You can join these office hours sessions on YouTube. The whole joining process is explained on the Google website itself: https://developers.google.com/search/events/join-office-hours?hl=en
How Google Ranking Works
Listening to a Google representative explain how Google Search works is always fascinating. In this latest episode, Google's Gary Illyes digs deeper into how Google ranks its search results.
According to Gary Illyes, indexed documents are each given a score: Google “assigns a number, which we calculate using the signals that we collected during indexing plus the other signals.” “What you see in the results is effectively a reverse order based on those numbers that we assigned,” he explained.
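Illyes's phrasing maps directly onto a sort: each document gets one number, and the results page is that list in descending (“reverse”) order of the number. A minimal sketch, in which the signal names, values, and weights are all invented for illustration:

```python
# Toy scoring: combine per-document signals into one number,
# then present results in descending order of that number.
# Signal names and weights here are invented, not Google's.

def score(signals):
    # A simple weighted sum of indexing-time and query-time signals.
    weights = {"topicality": 3.0, "pagerank": 2.0, "freshness": 1.0}
    return sum(weights[name] * value for name, value in signals.items())

docs = {
    "page-a": {"topicality": 0.9, "pagerank": 0.5, "freshness": 0.2},
    "page-b": {"topicality": 0.6, "pagerank": 0.9, "freshness": 0.9},
    "page-c": {"topicality": 0.2, "pagerank": 0.1, "freshness": 0.1},
}

# "Reverse order based on those numbers" == sort descending by score.
ranked = sorted(docs, key=lambda d: score(docs[d]), reverse=True)
print(ranked)
```

The point of the sketch is only the shape of the computation: many signals collapse into one number per document, and the visible ranking is just that number sorted.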
Algorithms such as RankBrain, and even the HTTPS boost, are then applied, though the HTTPS boost is only used as a tiebreaker and does not really rearrange the search results, he says.
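A tiebreaker of this kind can be modeled as a multiplier that is always applied but is too small to matter unless two results have essentially the same base score. The 1.001 factor below is a made-up number; the real size of the HTTPS boost is not public:

```python
# Toy tiebreaker: a tiny always-on multiplier that only changes the
# order when two results are otherwise tied. The 1.001 factor is invented.

def final_score(base_score, is_https):
    return base_score * (1.001 if is_https else 1.0)

results = [
    ("http://a.example",  5.000, False),
    ("https://b.example", 5.000, True),   # tied with a -- HTTPS wins the tie
    ("http://c.example",  7.000, False),  # big lead -- the boost can't catch it
]
ranked = sorted(results, key=lambda r: final_score(r[1], r[2]), reverse=True)
print([url for url, _, _ in ranked])
```

Note that `c.example` stays on top despite having no boost: a tiebreaker nudges ties, it doesn't rearrange the result set.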
Machine Learning and Spam Protection at Google
Before the conversation turned to ranking, Duy Nguyen of Google's search quality team spoke about spam prevention. One thing he said struck me as particularly interesting: Google uses machine learning models to fight the most blatant spam. This shouldn't surprise anyone, but it's good to see Google confirm it.
According to Nguyen, the company uses a “highly powerful and comprehensive machine-learning algorithm that basically got rid of the majority of the blatant spam.” This frees the search quality team to focus on “other vital work,” such as hacked spam, online scams, and other challenges that the machine learning models miss.
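The core idea of “a model trained on accumulated spam data” can be illustrated with a toy text classifier. This is a minimal naive-Bayes sketch over a four-example training set, nowhere near a production system, and none of it reflects Google's actual models or features:

```python
# Toy spam classifier: naive Bayes with add-one smoothing.
# Training data, features, and everything else are invented for illustration.
from collections import Counter
import math

def train(examples):
    """examples: list of (text, label) pairs, label 'spam' or 'ham'."""
    counts = {"spam": Counter(), "ham": Counter()}
    totals = Counter()
    for text, label in examples:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def classify(text, counts, totals):
    vocab = set(counts["spam"]) | set(counts["ham"])
    best, best_lp = None, -math.inf
    for label in ("spam", "ham"):
        # log prior + log likelihood with add-one smoothing
        lp = math.log(totals[label] / sum(totals.values()))
        denom = sum(counts[label].values()) + len(vocab)
        for word in text.lower().split():
            lp += math.log((counts[label][word] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

examples = [
    ("buy cheap viagra now", "spam"),
    ("cheap pills click here", "spam"),
    ("how to configure a web server", "ham"),
    ("notes on baking coconut cookies", "ham"),
]
counts, totals = train(examples)
print(classify("cheap viagra pills", counts, totals))
```

Real systems differ in every dimension (scale, features, model family), but the workflow is the same: labeled examples in, a decision function out, which is exactly what lets the blatant spam be handled automatically.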
Google's machine learning models draw on a huge amount of data to improve spam protection and search, and Google appears highly confident in their abilities.
Listening to Google representatives discuss search is always fascinating. The way they talk about search could provide us some insight into what really counts when it comes to rankings.
As Nguyen put it, it's unfortunate to see SEOs concentrate on a single metric, frequently an external one that Google doesn't even use, rather than improving functionality, content quality, and the overall experience for their users.
Because Google has hundreds of ranking signals, focusing on just one or two is unlikely to help you rank well in Google Search. Check our digital marketing sales page for an overview of these signals and the different ranking methods.
Here is the transcript, as promised.
Welcome everyone to another episode of the Search Off the Record podcast. Our plan for this series is to talk a bit about what's happening at google search how things work behind the scenes and who knows maybe have some fun along the way My name is john mueller, I'm a search advocate on the search relations team here at google in switzerland and I'm joined for this episode by Martin and Gary also on the search relations team. Our guest today is Duy from the search quality team. Duy would you like to introduce yourself? Briefly yes i'm so sorry Gary told me to do that thank you for having me I'm from the search quality team based in california so maybe it's just me and me being the newbie on the team What exactly is the search quality team? so i mostly focused on the low quality and spam aspect of the work but basically we have a lot of signals and a lot of new websites and pages we look at every day how do we rank the high quality results while demoting low quality or spam one so that's basically the search quality team what brought you to a team that looks at the lower quality part of the web i'm a curious person uh before joining Google i worked a lot of different roles but i also worked on marketing and search engine optimizations i've always been interested in how search works and how do you get information and make sure that people can find it how do you tell search engines that oh hey i have this piece of information can you rank it in a way that you know people can search for a query and then my results would come with the relevant search queries so i saw this job posting on google it looks interesting if you know people are doing seo and some people are doing bad stuff and spam search engine how do you guys fight against that as the google search engine of the world so i was curious and with my previous web development and seo knowledge i thought i could offer some help so yeah here i am that is amazing i think the first time i came in contact with web spam 
was i remember the first thing i built with php on my own website was a counter like a visitor calendar and then the second thing was a guest book and oh my god it was a simple form that was sent back to the server and then saved into a database i think in the beginning even just like a file and within days i had so many entries in my guestbook and i was like woohoo so many people commented in my guestbook and it was all spam oh man oh god those were the days i'm sorry for that martin i mean it wasn't me not really i mean i don't know gary was it you? it was probably gary yeah yeah it probably was me thanks gary thanks for ruining that website for me you're welcome anytime so annoying okay yeah so what did you end up doing martin did you add anti-spam measures did you add signals oh my god i wish i would be as smart as Dewey is back when i wrote that and i couldn't figure it out i basically just built in like a very basic version of the captcha and that all surprisingly like drove away 95 of the spammers apparently because it was no longer a super simple target but then i still got a lot of bad posts and those i just decided to turn it down oh my gosh how would you actually detect that so for such low quality or spammy content it's relatively easy if you're a person and you look at a page that's full of gibberish or in this case guest books with spammy post you should be able to say that emphatically yes this is spam you know seconds even if it's more complicated with a trained eye it should take less than a minute to determine if something is spammy or not and as google we have all these signals and all these data that we've accumulated and analyzed and studied over the years so you know it's entirely possible to collect those data to study it and build things like machine learning models to tackle spam machine learning model is interesting because it has so many use cases it recommends music for you to trust is enough to drive cars around so you don't have to 
drive so building machine learning models for spams turns out to be a pretty natural step for us so yeah we have so many data around not just a search result but specifically spam so we were able to build a very effective and comprehensive machine learning model that basically took care of most of the obvious spam it basically took over all the heavy lifting so we can focus on more important work so cool so if that model were to run across martin's guestbook with all of these friendly non-people visitors what would it do would it say like martin is a spammer or like how how would you treat that website so for sites that were specifically built for spam we basically demote them so they never show up for relevant queries for sites that are overrun with spam you know like give a mix of very good quality content and then part of your sites are being overrun by spam we have manual actions to help the webmasters and let them know that oh yeah this is happening you should take care of that actually we just published a blog post yesterday about this so we will send you a notification and then we would explain that portion of your site is really overrun by spam mostly this is a guest book or a forum or the comment section that you should just go in and nowadays it's relatively easy to take care of spam yeah you have a ton of measures and tools if you use cms's so cool so basically if you ran across martin's guest book you would try to figure out who this person is and send them a notification i guess finding out who the person is just based on things like search console accounts like if martin is verified in search console and he still had his guest book running and it was overrun like this and we noticed it we would say hey martin clean up your site yeah we shouldn't punish the entire site that martin's been building because one of the pages spam right that's something we really care about the good content so yeah we would try to ask martin to clean it up and help him 
verify that so it was a smart move that my guest book content was on a separate page so you would like kill that page never show it in search results but all the other fantastic content that i'm sure i had back in the day like lots of cat images and stuff probably has been fantastically well ranked yeah so it's you know even more important that you are verifying search consoles so if we detect that your site is hacked or part of it is being abused for spam we can let you know immediately we're pretty good with that it happened a few times to me as well actually in the past with different websites yeah that's a really nice tool i think what do you do with your website martin hey i had comments on my blog i had lots of fantastically well written content on my blog basically i know that it's fantastically well written and useful because i wrote it for myself because i forget things so like i learned a thing i figured out how to do something and then i wrote a blog post and like six months later i would google for the thing and then my blog would come up but when i had comments enabled i had the exact same problem i really could not be asked to implement some sort of spam fighting measures so i got a manual action at some point and i was like ah okay that's annoying so i disabled comments okay it's like way to throw everything out just because of a little bit of spam that's why we can't have nice things yeah so i'm kind of curious like how did you recognize it was spam did you like realize at the time that it was around seo or did it just look like why are these bots posting on my site i mean i had a tech blog and the comments were like hey you want to buy a cheap viagra and i'm like maybe they haven't actually read the article maybe they just like are not even human as do we said you get the feeling and with the trained eye i could determine that to be spam so cool yeah i don't know how cool that was really like it was not cool yeah don't do spam kids when you say 
trained eye do you mean that you trained yourself to spam and that's how you recognize the others i mean or um so i would say that if you just look at spam a few times and you immediately know what it is it's really not difficult oftentimes it's just gibberish or a bunch of keyword stuff together like what do spammers really think that this is 1998 we're still building web pages with front page or dreamweaver and adding 500 spammy keywords would make it do well in search engines in general i'm very curious to understand like why they think that would work but it should be very obvious so do you think a lot of this is just old scripts just being run on autopilot or how do you see that i think it's just lazy i think they just basically they want to rank a piece of content maybe it's low quality maybe it's spam but then instead of you know spending that time to write a relevant article for it building a site with good user experience with categories with tagging with all these helpful user experience they simply write a script to generate something very fast and simple and just spread around hoping that one of these would rank well in search and make them some small amount of money i mean this is very similar to the phishing and scam attacks on email because they're when you read the emails the scam emails about the Nigerian prince sending you an email about heritage and whatever or inheritance when you read it it's very obvious to you that maybe i shouldn't reply perhaps but then i don't know one person out of a thousand will actually reply with data that they can use to make some money out of if you have a script that's running on autopilot and spamming the web quite literally the whole web then if you make money out of i don't know with one spam comment that you left out of i don't know millions of comments there's your money and you didn't invest all that much into it yeah i think that the fact is that there's a lot of people trying to spam google actually we 
publish every year that you know we for the most recent figure we had was every day we discovered about 40 billion pages that are spam it's actually not a typo it's billion with a b and when users search stuff on google less than one percent of search queries would land on a spammy page so a lot of work went into detecting and removing those spammy queries so at the end of the day i think it's still possible to do that to prevent these spammy and manipulative content from reaching users and that's what we focus on every day we have enough signals and solutions to do with these now when i read that number of the number of kind of spammy pages we discover every day it really blew my mind it's it's like amazing to see how much energy must be put into just creating a giant mass of pages and it's it's really cool to see that the quality of the search results is is extremely high it's like every now and then i will run into something where i'd say oh this is pretty spammy or like this is redirecting or this is obviously someone's expired domain and they're hosting something else on it now but it's really rare that i run into these kind of things on like normal searches that i do so that's like i don't know how many gazillion people you have working on this but it's it's been working really well i think yeah that's an asian proverb i'm not sure how to translate this but it's literal translations like the more you sweep the floor the more dust you find so if you really try to find spam you may come across that but overall even in my personal day-to-day uses of google search i really don't come across spam all that much and that's what we strive for less than one percent of whatever you search for should be you know servicing some bad stuff and more than 99 of your time should go towards like high quality and relevant results yeah so what kind of things do you still find problematic on the web like where does our impact kind of come to its limits or i don't know how to 
frame it i would say hack spam is still a problem for the ecosystem many sites still run on older versions of cms's or you know they use outdated plugins or templates if you think about it well personally i don't know anyone that still runs windows with stuff and if you have friends that still run windows with us you probably judge them right so can we do that as the web ecosystem if people still run really outdated cms's can we help them to get on you know a version that is extremely more secure a lot of the hack spam that took place today is barely any hacking a lot of the tools and scripts that you know people discovered like five six years ago sometimes still being used today to exploit websites especially like older websites i think at the very least we should make it a lot more difficult for these spammers to hack into sites and spread spammy or malware content because when users visit your website like if they visit martin's tech blog they don't expect to walk away with ransomware or malware i think we have enough resources and cooperation in the ecosystem to make that happen i really look forward to that yeah i think that's super tricky too because i don't know how other people run their websites but for me i would put up a new blog and put some content on it because i thought like oh it's like i will spend a lot of time write a lot of stuff here and then i just never get to it and then it just keeps running and running and running and if you don't activate the automatic updates then suddenly you're running this old version you're like oh i don't really have time to deal with it and you don't even realize what is happening there and i imagine at mass like looking at like the bigger part of the web there are lots of small companies that just have their site kind of running like that where it's like oh like people can find my phone number and that's good enough and they don't realize that they're potentially causing a problem for the bigger web just by keeping 
things running on something that is essentially outdated i would also say that the very least they can do in those situations is to sign up for search console because then they would have more data you know where they would realize that oh yeah running this very old version of cms really hinder the site's potential maybe you know it's just a whole lot slower if you have a bunch of improvements that search consoles say you should do it's just extremely difficult so now suddenly they realize there's a lot more incentive to keep the sites up to date and obviously you're signed up with search console if we find hack or any problems we would notify you immediately now we're pretty fast and we're pretty effective at detecting hack so yeah that's the least you can do and hopefully by signing up for search console you find more incentives to keep your sites up to date do all these improvements that in the end would benefit users a lot yeah i don't know martin were you signed up for search console in the beginning or when when did that come up on your radar it didn't happen in the beginning it happened eventually when i grew the blog a little more and i noticed that like a lot of people are actually using it and it did show up a bit in search i was like maybe i should actually sign up for search console to get like a little more insights into it and i didn't even realize that that would help me with things like manual actions and and hacked issues i didn't really have that many hacked issues because it is a static site so that's nice but yeah i think it's a pretty useful tool but i doubt that every so i think like a lot of the old websites that are around are probably from people who aren't necessarily primarily concerned about their website i don't know like the bakery around the corner the cafe in a small town they want to have a website in case someone needs to discover them or wants to find out something like what's on the menu today or something like that but they 
don't necessarily care enough to update or know enough to update their cms and they probably also don't really are the main audience for search console so that can be a little tricky i think with a bunch of cms communities we are now working on closing that gap things like site that tries to get the key information into the cms control panel so that you don't have to go to third-party tools i think that's a pretty cool approach yeah i also moved my whole site to a static site as well partially also because of all of the hacking things i mean i had everything set up to automatically update and every week or so i'd get an email saying oh we installed another update and at some point it was like i get so many emails that my blog is updating automatically and none of my content has been updated for 10 years so it's like i might as well just make a static site out of it and also kind of protect from the hacking angle there i don't know from one of the things that i think came up to me out of all of the hack discussions and seeing how things are hacked on the web from google internally and from the help forums is that especially for smaller businesses it makes sense to kind of offload all of that and just say like use some hosted platform instead of trying to host your own website do you think that would protect against most of these hacking things or what additional things should small businesses kind of do yeah i think that would be a good solution actually we just published a number that in 2020 we sent over 140 million messages to site owners in search console that's a lot more messages than previous years right and the bulk of that was from sites that were coming onto search console for the first time so a lot of businesses because of pandemic or whatnot realize that they need better online presence so suddenly they invest a lot more into you know building the website even simple things like menu were suddenly updated a lot more frequently and now you can order 
online to pick up or get delivered and i noticed they also worked with a lot more hosted platforms so i think that's a good solution if you don't have your dedicated team to manage your website or social media presence you can go with the hosted platform and that probably take care of a lot of the overhead cool so what about your site gary like when are these recipes going online and what would happen if someone were to hack your recipes and change the butter for mayonnaise oh no that would probably work actually in many cases but my recipes are online actually i keep them in apple notes and they are kept in the cloud oh i mean like publicly online can we share your password with the podcast listeners i would prefer not to share my password for reasons i have stuff recipes that i would prefer not sharing at least not yet because i have to perfect them like the Doryaki that i was eating before we started recording this podcast that was uh one of those recipes which i took the original japanese recipe and i made it well different swiss cheese yeah i added swiss cheese i'm glad that you are not obsessed with cheese cool what other things do you kind of like what keeps you awake at night do we when it comes to search quality don't say the cats i learned, that's the wrong answer i would try not to the other bits that kind of keep me up at night is scam there's a lot of scams going around all sorts of scam but for example customer support scam used to be a popular one if you're looking for gmail customer support number a lot of people try to rank for that and would publish a false number to make you call them and then for some reason somehow at the end of the conversation you'd be sending them money or buying them gift cards there's a lot of youtube videos you can find about how this scam works so we did a lot of work into preventing that we were able to protect hundreds of millions of queries since 2018 that going to customer support queries we basically demoted most of 
the scam there but i think there's also the awareness part you know people if they're similar to very pure spam very obvious spam if you know such scam exists you probably won't fall for that a lot of time people you know click on these sites or call these weird numbers maybe because they don't know that it's a possibility that someone were out to trick them but apparently that's a common problem that government departments like the irs deal with that all the time and yeah now it may spread to other parts of the web but we're doing a lot to prevent scam from reaching users but users should also research more to protect themselves against scam it's relatively easy once you know that and once your family members know that it probably won't happen yeah i don't know for a while they were quite visible in the search results but it feels like i haven't seen them for a really long time so i don't know what you've been doing but it seems to be working pretty cool one question i always get where maybe you have some insights or some tips as well is what if like a competitor of mine or doing something kind of spammy where maybe they're just like keyword stuffing on their pages or they're creating some kind of a doorway page and i know that this is spammy because i read the webmaster guidelines and my competitor is getting away with it like they're ranking right above me what could i do there is is that something where i can report them to i don't know the spam police and they'll take care of it for me or what are the options or is it even like something where i can do anything about it yeah i would say that a lot of times uh maybe the competitors is not necessarily ranking well because they do spam there are so many factors when it comes to ranking i'm sure gary will touch on them but if you're really concerned about that you can report them to us we have a spam report that we review pretty frequently so yeah please send us a spam report you can also seek help in the support 
forum the webmaster help forum and then yeah we will also be able to take a look yeah so cool yeah i don't know it's i always feel a bit sorry for people who are seeing that kind of thing where they're kind of almost like stuck in a situation where they're thinking well maybe i should be spamming as well so that i can rank above my competitor who's spamming but that always feels like a bad idea yeah if everyone was doing that then where does that leave the users will they have you know good user experience and good content to consume i really don't think that's a solution i think everyone should be focusing on doing what's right and doing what's best for not just your website but for your users if you focus too much on a single metric or something that you think that would for some reason propel your site most of the time it would lead to a pretty negative outcome yeah i think it's also like you said kind of like one of those things where like you don't even know if it will actually help your site and potentially it'll just harm your site and then you're just digging a bigger hole for yourself rather than working on something positive for your website to improve things for the long run yeah an example of that that we observed was webmasters or spammers tend to focus on improving one or two particular metrics that are external that we absolutely do not use they for some reason think that if they put a lot of time and money in improving such scores it would perform really well on google search i've never seen a case where that actually worked well and i find it you know pretty sad right because if all that time and money were spent on building up the website with better user experience more functionality writing better quality content producing high quality images they probably do a lot better on search and obviously a lot more sustainable for the site itself yeah the one area where i kind of see where people i don't know use that almost in a reasonable way is when 
it comes to monetizing their site where like they just want some externally visible metric to go to some advertiser and say like look my site is actually pretty reasonably placed and if you spend some money with me then i can get your message out to a broader audience but it feels like sometimes i see people in the forums just saying like i want to improve this metric they don't really want to kind of focus on the site overall they're just like i just want to change this number from seven to 25. like why it doesn't change much yeah i love data myself i think you know the more data you have the better you would be at your role whatever that may be as a site owner or a online marketer i think it's really great to have a bunch of metrics that you monitor and measure and try to improve as long as you don't focus on one thing as a site owner i used to look at you know bounce rate and time spent on pages all the time for example to know which content that are really hitting it off with my audience so i can improve more or for some reason i find that nobody really discover our contact or support pages why is that do we have a problem there if people need to contact us maybe we should just put it somewhere else we write better content so yeah as long as you don't focus on one single thing because we have hundreds and hundreds of ranking signal focusing on one thing doesn't mean you will improve it across the board and we would rank your site better no yeah so many ranking signals i don't know gary what do you think which is a signal like people should just focus on like just pick one the i haven't heard of that one you haven't i'll have to google that i think it involves the meta tag if i remember correctly the meta cheese tag yeah yeah you know i just wish search engines would serve cheese instead of pages i'd be down for that that's a horrible idea like that's one of the worst ideas that you've ever had actually okay so tell us a bit about serving gary how does it 
actually work i was telling you about that for the past four episodes or three episodes or whatever come on you are not listening again geez yeah jeez so ranking is one of those topics where we don't want to say too much and that's on purpose but if you ever went to a information retrieval class then you heard about ranking there and the public version of it because ranking results is actually just math and figuring out first basic relevancy and then some magic that and that's the part that that we are not talking about basically that's the magic part is it's card tricks really card tricks sometimes coin tricks thanks martin that was very useful every search engine has its own kind of magic that they are using that they were brewing for years and typically the part that we don't want to talk about but other part the thing that you can hear about in computer science classes that's actually there's no reason not to talk about it because it's kind of public so once you had the query and you found a bunch of results in your index then you will have to start ranking those results that you found before presenting it them to the user and that is done i mean i'm waving my hand here using hundreds of signals but some of them are quite obvious one that we typically refer to as topicality uh basically relevancy that is based on the query and the contents of the page so for example if you are writing about cookies i don't know lemon coconut cookies then if the query contain those terms then your page will be in the retrieve set now search engines don't return all the results all the time because for some queries like for example cookies just the term cookies there might be billions of results quite literally and there's no reason to return all of them because there's no person who is ever going to go through all of them so what we need to do don't make that face because really there is no person that is going to go through billions of results so what we need to do is limit the 
number to something more reasonable, say 1,000 to 10,000, or whatever the number is at the moment. How do you do that? Basically, you start ranking the pages. You apply the things that you could collect during indexing, I don't know, PageRank for example, again topicality, and other simple quality signals, to create a reverse-ordered list of the results that you intend to send up to the user from the index. And once you've created that list, the reverse-ordered list, you make a cut at whatever the number of results you can present is, say a thousand, and from there on you only have to work with those thousand results.

But those results are not finished with yet, meaning that you can still tweak them to make them better, and that's usually where the magic is. Those are the signals, the magic signals, that we still apply on the result set to make it better for the user's query. You probably heard of RankBrain, for example. As for the listeners, I hope everyone else here has heard about RankBrain, right?

I imagined it was like "rank playing," "ring plane." I have heard that before, yeah.

Okay, okay, phew, we don't have to fire you.

Fun fact, actually: I'm not allowed on a bunch of the documents that you guys have access to. Just saying.

Okay. Snap. Anyway, so we still have to reorder the documents to make them more relevant to the user's query, and that's where we would apply, for example, RankBrain. These magic signals, or magic algorithms, can still make massive changes in the result set, but they are only working with the thousand that we already presented to them.

Now, ranking is number based. Basically, for each result we assign a number, and we calculate that number using the signals that we collected during indexing plus the other signals. Then essentially what you see in the results is a reverse order based on those numbers that we assigned. The magic signals or magic algorithms that we use, like RankBrain, what they do is multiply those numbers
that we assign to each result by a number. For example, if they want to promote a result because it was determined that it would be a better result for a lemon coconut cookie query, then let's say they would multiply the result's score by two, basically doubling its score, which means that it will jump up in the result set. I have no idea why I'm gesticulating with my hand, because no one can see it.

It looks great, though. It looks great.

At this stage, thanks, Martin. Very disturbing.

You're doing well.

Anyway, if we wanted to remove a result from the set for whatever reason, we could multiply its score by zero, because that will turn the score to zero, and a score of zero, why would you present that?

Just to pause you for a moment, what happens if multiple pages have the same number in the end? It's like someone throwing dice.

That almost never happens. It's highly unlikely that you would see a result that has the same score as another result. If that happens, then it will just be fluttering, basically one jumping up and down as you refresh the page; basically they will just switch positions. But I think it's highly unlikely that will happen. We have all these algorithms that are actually kind of designed to cut ties, like the HTTPS boost, which is one of those magic algorithms that would cut this tie: if one of the pages, one of the URLs, is HTTPS, or starts with https, then that would get a tiny boost to actually propel it a little further up in the result set.

I actually recently got asked if the tiebreakers are only applied in those situations, or if they are always applied and just don't have as large an impact as others.

They are always applied, but they are designed to not have a large impact. I can only speak for the HTTPS ranking boost in this case as a tiebreaker, because, I mean, I created that with a teammate, Zineb. And there the design was, or we designed it such, that it will work all the time, but it will not rearrange
the result set unless there is a tie, and then it would boost one of the results, well, all the results actually, but it would be more visible for the results that had a tie. And it actually happened quite often, in some locales more often than others. I still remember that in Hindi–India, as in the country–language pair, it happened in 17% of the cases or something like that. So it was actually pretty active in some locales and less so in others.

Okay, so I guess these are real numbers, not integers, where one has seven and the other one has eight?

I want to say these are floats.

Okay, so there's a lot of room, I guess is what I'm saying. In the search results, it's not that there's this one-number jump where, if you can only get that one extra point, you rank higher.

No. So I was actually looking, I think you sent an email about site-colon queries, and then I was doing some query debugging and I was looking at the site:johnmu.com results, and the scores were like, the top score was like 0.76 or something like that.

That's like a bad site?

No, for that result set, that was the top score. It doesn't mean anything, basically; the scores are relative to the result set, not to the whole index. And then the second one would have had like 0.74, but then one of these magic algorithms kicked in and boosted it a little, so it migrated one up.

Well, it got the extra HTTP ranking boost, of course. It's boosted up. Now it's ranking number one. I still don't know for which queries.

But HTTP? It's HTTPS.

Oh, okay. I think for your site we should design an FTP ranking boost.

I should move to FTP.

No, that's the article that you wrote about hacking FTP, and I still remember that. That's a long time ago, man. Oh my god. Sure.

So what happens when everyone's on HTTPS? Do we prefer sites that serve better cheese?

What, the cheese tiebreaker? Yeah. Unfortunately, Matt is not around anymore, so I'm not sure that we could launch that, but I
would design that.

I mean, we can launch cheese. That's easy, like with a catapult.

I was about to ask, like a catapult?

Yeah, or we could go up to, I don't know, glad dolph or something and just launch cheese down from there. They're already shaped like a projectile, so I think that's possible.

We could do a literal moonshot.

So these are the worst ideas ever. I don't know, I think Google is missing out by not having me as, like, a product manager who's making decisions here.

No, I think that's the right decision, actually. I mean, launching cheese has actual impact.

That hurt. No more cookies for you.

Oh, come on. If you're at the receiving end, that will hurt.

Gary, what did we miss? What did we miss?

We missed cookies.

Cookies, yeah. Yeah, and kale. By not going to the office, you're missing out on a lot of kale. Imagine all of the kale that you didn't eat because you weren't in the office.

And I'm much healthier because of that.

Wait, because you're not eating kale?

Yeah, like, mentally. It improves my mental health if I'm not eating kale.

Yeah, yeah, I got that. I just, you know, you do you, man. Martin's missing the kale, I see.

No, I'm absolutely not. I really am not.

Are you missing kale, or do they have kale in Mountain View? Is that just, like, a Swiss thing?

They do, and I'm sorry, I'm on Gary's side for this one.

Well, that's a first. If people have any questions around search quality, or comments or ideas for search quality, where should they go?

Yeah, thank you for having me. You can file spam reports; we are really happy to see them, and we review them very frequently. If you have problems, or if you think you need help with something, you can always go to the help forum, the Webmaster Help Forum, and that feedback will also reach us.

Cool. Yeah, it feels like the help forums are always a good place to go if you don't know what to do, because the experts there can kind of guide you in the right direction, where they realize that actually you should be filing a spam
report, or that actually you should be contacting Gary directly on Twitter, and then they can point you in that direction. That's always a good thing to do.

Yeah, no, he's way too free nowadays.

Or just launch cheese.

No, no, just don't. Don't send any cheese. Or kale.

I love cheese. I don't like kale.

Oh my. Cool. Well, it's been fun doing these podcast episodes, and I hope you, the listener, have been enjoying them as well. At any rate, let us know how you're liking these. If there are any topics that we should be including in any of the future episodes, drop us a note, send us a comment on Twitter, or comment wherever you comment on podcasts. And of course, don't forget to like and subscribe. So with that, thank you, and until next time. Bye, everyone.

Bye. Cheers.
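The retrieval step Gary describes early in the episode, where a page enters the retrieved set when the query terms appear in its content, can be modeled with a toy inverted index. This is purely an illustrative sketch; the documents, tokenizer, and index shape here are invented for the example, and real search engines are vastly more sophisticated.

```python
from collections import defaultdict

def build_index(docs):
    """Map each term to the set of document ids whose text contains it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def retrieve(index, query):
    """Return ids of documents that contain every term in the query."""
    term_sets = [index.get(term, set()) for term in query.lower().split()]
    return set.intersection(*term_sets) if term_sets else set()

# A page about "lemon coconut cookies" is retrieved for that query,
# while the broad query "cookies" matches every cookie page.
docs = {1: "lemon coconut cookies recipe", 2: "chocolate chip cookies", 3: "lemon tart"}
index = build_index(docs)
```

For the query "lemon coconut cookies" only document 1 survives the intersection, while "cookies" alone retrieves both cookie pages, which is why the broad query needs the ranking and cutting steps described next.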
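The ranking flow Gary walks through, scoring every retrieved document from indexing-time signals, sorting in descending ("reverse") order, cutting the list to a workable size, and then applying multiplicative "magic" adjustments such as a ×2 promotion, a ×0 removal, and a tiny always-applied HTTPS tiebreaker, can be sketched in Python. This is an illustrative model only, not Google's actual code; every name, weight, and score below is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class Doc:
    url: str
    base_score: float        # from indexing-time signals (topicality, quality, ...)
    multiplier: float = 1.0  # accumulated multiplicative re-ranking adjustments

# A tiny, always-applied boost: too small to reorder distinct scores,
# but enough to break an exact tie in favor of the HTTPS page.
HTTPS_TIEBREAK = 1.001

def rank(docs, cut=1000, promote=None, demote=None):
    """Score, sort descending, cut, then apply multiplicative adjustments."""
    # Initial reverse-ordered list, cut to the top `cut` results.
    shortlist = sorted(docs, key=lambda d: d.base_score, reverse=True)[:cut]
    for d in shortlist:
        if promote and d.url in promote:
            d.multiplier *= 2.0   # "multiply the result's score by two"
        if demote and d.url in demote:
            d.multiplier *= 0.0   # score becomes zero -> effectively removed
        if d.url.startswith("https://"):
            d.multiplier *= HTTPS_TIEBREAK
    # Drop zero-scored results and re-sort by the adjusted score.
    final = [d for d in shortlist if d.base_score * d.multiplier > 0]
    return sorted(final, key=lambda d: d.base_score * d.multiplier, reverse=True)
```

Note that, as in the episode, the scores are floats that only mean something relative to this result set: a 0.74 HTTPS page beats a 0.74 HTTP page thanks to the tiebreaker, but still sits below a 0.76 page, and a ×0 demotion removes a page entirely regardless of its base score.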