The term SEO scares many people, but it is not such a complex issue. There are two types of SEO: one is on-page SEO, which requires a lot of articulate, detailed content manipulation and change. The other is technical SEO, which is the relatively easy bit. Technical SEO comes down to following some very simple best practices. Here we go:
CHECK THE INDEXABILITY OF YOUR ROBOTS.TXT
Robots.txt is like your website's house manager. This is where you put rules: you tell search engine crawlers what to do, like not to visit certain places ("Disallow") or how long they should wait before visiting your office again ("Crawl-delay").
Confirm that it actually exists on your website: type /robots.txt at the end of your domain, i.e. https://yourwebsite.com/robots.txt. Make sure to include the sitemap on the last line of your robots.txt.
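A minimal robots.txt putting these rules together might look like the sketch below (the /admin/ path and the 10-second delay are just illustrative examples, not recommendations for your site):

```
User-agent: *
Disallow: /admin/
Crawl-delay: 10
Sitemap: https://yourwebsite.com/sitemap.xml
```

Note that Crawl-delay is honored by some crawlers (like Bing) but ignored by Google, which manages crawl rate through Search Console instead.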
INSPECT YOUR WEBSITE STRUCTURE:
Your website should be structured so that both people and search engines can read it. Here is how:
Clean: keep your URLs clean and readable.
Secure: make sure all URLs are secured with a Secure Sockets Layer (SSL) certificate; use HTTPS, not HTTP.
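As a rough sketch of what "clean and secure" means in practice, here is a small Python helper that flags common URL hygiene problems. The specific checks (HTTPS, no uppercase, hyphens instead of underscores) are illustrative conventions, not official rules:

```python
from urllib.parse import urlparse

def url_issues(url):
    """Return a list of common hygiene problems found in a URL.

    The checks below are illustrative best-practice conventions:
    serve over HTTPS, keep paths lowercase, and separate words
    with hyphens rather than underscores.
    """
    issues = []
    parsed = urlparse(url)
    if parsed.scheme != "https":
        issues.append("not served over HTTPS")
    if any(c.isupper() for c in parsed.path):
        issues.append("path contains uppercase characters")
    if "_" in parsed.path:
        issues.append("path uses underscores instead of hyphens")
    return issues

# A clean URL produces no issues; a messy one produces several.
print(url_issues("https://yourwebsite.com/blog/seo-tips"))
print(url_issues("http://yourwebsite.com/Blog_Post"))
```

Running a check like this over your sitemap is a quick way to spot pages that break your own URL conventions.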
Ensure your website's metadata speaks for you:
Alt text: make sure all images have alt text. This helps the bots read through what you have.
Title tag: this informs the search engine of the content of the page; make sure you have at most one per page.
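The two checks above can be automated. Here is a minimal sketch using Python's standard-library HTML parser to count title tags and flag images missing alt text (the class name and sample HTML are hypothetical):

```python
from html.parser import HTMLParser

class SEOAudit(HTMLParser):
    """Counts <title> tags and flags <img> tags with missing/empty alt text."""

    def __init__(self):
        super().__init__()
        self.title_count = 0
        self.images_missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.title_count += 1
        elif tag == "img":
            # attrs is a list of (name, value) pairs; an absent or
            # empty alt attribute counts as missing.
            if not dict(attrs).get("alt"):
                self.images_missing_alt += 1

auditor = SEOAudit()
auditor.feed(
    "<html><head><title>My Page</title></head>"
    "<body><img src='a.png' alt='A chart'><img src='b.png'></body></html>"
)
print(auditor.title_count, auditor.images_missing_alt)
```

In this sample, the page passes the title check (exactly one) but fails the alt-text check (one image has none).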
The Google Chrome browser has a built-in tool called Lighthouse; it can give you an idea of how your website is performing on best practices.
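Besides the panel in Chrome DevTools, Lighthouse also ships as a command-line tool you can run with Node.js; a typical invocation (the category names come from the Lighthouse CLI) looks like:

```
npx lighthouse https://yourwebsite.com --only-categories=seo,best-practices --view
```

This audits only the SEO and best-practices categories and opens the HTML report in your browser when it finishes.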