
Our website was indexed!


Hi everyone, I'm programmer Fishskin. Our team built Interview Duck, an interview practice website for programmers. Less than half a month after launch, it was indexed and recommended by Baidu and other major search engines!

The effect is obvious: when users search for "interview duck", the first thing they see is our own website, which drives traffic to the site:

Getting search engines to index your site faster is actually a deep subject in its own right; the professional term for it is SEO. For individual webmasters, search engine traffic is crucial, and we all hope our sites will be seen by more people, right? Even leaving revenue aside, a site with heavy traffic gives you something to brag about on your resume and in interviews~ So I suggest that programmer friends develop at least a basic understanding of SEO.

In today's article, I'll use our Interview Duck site as an example and share some practical SEO tips, so that your websites can be indexed faster by search engines.

You can also watch the video version, hot off the press: /video/BV1tz421i7Q1

 

Fishskin's Practical SEO Tips

I. What is SEO?

SEO stands for Search Engine Optimization. It is the practice of making a website easier for search engines to include and display, so that more people can find your site through Baidu, Google, and other search engines, increasing its traffic and visibility.

Before learning how to do SEO, let's first briefly walk through the SEO process, that is: how does a search engine find your website and surface it when users search?

 

II. The SEO process

The entire SEO process can be divided into four main stages: crawling, inclusion, indexing and ranking. I'll explain these four steps in detail below.

1. Crawling

Crawling is the first step in the SEO process, where search engines send out a bunch of crawling programs (commonly known as spiders) that crawl all over the internet, visiting various websites and crawling the content of the pages. These spiders will follow links from one page to another, traversing the entire website as far as possible.

2. Inclusion

Once the crawl is complete, the search engine analyzes the page content and decides whether to include the page in its database. Only an included page can show up in user searches, so getting pages included is a key step in SEO. Some websites have plenty of links and content, but if the search engine's spiders don't take to them and don't include them, nobody will be able to find them in search results, even when searching specifically for your site.

3. Indexing

Indexing is the process by which a search engine organizes and categorizes the content of the pages it has included, building a huge index library. It's similar to tagging each web page so that when a user searches, the search engine can quickly find pages related to the search term.

For example, if the content of our Interview Duck website includes a Java interview question bank, a front-end interview question bank, and a C++ interview question bank, these terms are likely to be set as index entries, and users searching for content containing them are likely to land on our site.

4. Ranking

Nowadays there are countless websites, and many of them share the same index terms, so how do we make sure users find our website first? This brings us to the final step of SEO: ranking.

When a user enters a keyword into a search engine, the search engine selects the most relevant web pages from its index library according to its algorithm, then sorts them by relevance, weight, site quality, and other factors to determine which pages appear on the first few pages of search results.

That's the whole SEO process, and the optimization tips I'm going to share below all revolve around these stages.

 

III. How to optimize for SEO?

1. Keyword optimization

Keywords are the words users type into search engines. You can set keywords and a description in your website's HTML head to help search engines index your site and improve your pages' ranking in relevant searches.

Keyword selection needs to be precise and strongly related to the site's content. Avoid stuffing in too many keywords, or the search engine may judge it to be cheating.

For example, for an interview practice website, you could set the following keywords and description:

<meta name="keywords" content="Programmer Interview, Java Interview Questions, Programmer Job Search, Computers">
<meta name="description" content="Programmer Interview Brush Up, come to Interview Duck, a free job interview brush up site for programmers. Massive amount of high frequency Java interview questions, help you prepare for technical interviews.">

 

2. Site structure optimization

Website structure optimization has two aspects: optimizing the page structure of the whole site, and optimizing the content structure of each individual page.

For the whole site, page nesting should be as flat as possible; shortening the page hierarchy reduces the difficulty of the crawler's job.

As an example, which of the following two website structures do you think is more likely to have all of its pages visited by crawlers?
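
For instance, compare these two hypothetical structures (the paths are made up):

Site A (flat): /  →  /questions  →  /questions/123
Site B (deep): /  →  /category  →  /category/java  →  /category/java/collections  →  /questions/123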

The answer is self-evident. For important pages that you want search engines to find faster, shorten the path to them as much as possible and add extra entry points to them where appropriate.

Within each page, there should be a clear hierarchical structure: sensible heading tags (e.g., the first-level heading <h1>) make the page content easier to index.
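
For example, a question-bank page might organize its headings like this (a hypothetical sketch; the titles are made up):

<h1>Java Interview Question Bank</h1>
<h2>Collections</h2>
<h3>What is the difference between ArrayList and LinkedList?</h3>
<h2>Concurrency</h2>
<h3>What does the volatile keyword do?</h3>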

 

3. Friendly links

Back in university, when I first built my personal website, I used friendly links to increase the site's weight (although the effect was limited). The method is very simple: you add links to other people's websites, and they add links to yours. With both sites recommending each other, it's easier to climb the search engine rankings.

The principle behind friendly links is also simple. Many search engines rank websites by weight, so how is weight calculated? With a very simple algorithm called PageRank: each website has votes to cast, and when another site adds a link that jumps to your website, it effectively casts a vote for you. Websites with more votes get higher weight and better rankings. Exchanging friendly links amounts to voting for each other, which beats being a site with no votes at all.
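
To make the voting idea concrete, here's a toy TypeScript sketch of PageRank-style scoring (a simplified illustration, not how any search engine actually implements it):

type Graph = Record<string, string[]>; // site -> sites it links to

function pageRank(graph: Graph, iterations = 20, damping = 0.85): Record<string, number> {
  const sites = Object.keys(graph);
  const n = sites.length;
  let rank: Record<string, number> = {};
  for (const s of sites) rank[s] = 1 / n; // everyone starts equal

  for (let i = 0; i < iterations; i++) {
    const next: Record<string, number> = {};
    for (const s of sites) next[s] = (1 - damping) / n;
    for (const site of sites) {
      for (const target of graph[site]) {
        // each outgoing link passes on an equal share of this site's vote
        next[target] += (damping * rank[site]) / graph[site].length;
      }
    }
    rank = next;
  }
  return rank;
}

// Sites a and b exchange friendly links; site c has none.
// a and b end up with noticeably higher scores than c.
console.log(pageRank({ a: ["b"], b: ["a"], c: [] }));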

Of course, this mutual recommendation needs to be used with caution: excessive link exchanges may dilute your site's weight.

 

4. Sitemap file

A Sitemap is a file that lists all the pages of your website, usually placed in the site's root directory or referenced from a file such as robots.txt. It helps search engines understand your site's structure faster and crawl the pages you most want included.

It's the equivalent of handing the crawler a map, making it less likely to get lost or miss important pages on your site.

For websites with a simple structure, a static, fixed Sitemap is sufficient, for example:
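
Here's a minimal static sitemap.xml in the standard sitemaps.org format (the URLs are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-08-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/questions/java</loc>
    <lastmod>2024-08-01</lastmod>
  </url>
</urlset>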

However, for websites whose content is continuously updated, there's a more advanced move: use a program to generate a dynamic Sitemap automatically, for example regenerating the Sitemap with the new questions added each day, so crawlers can find the latest content faster.
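
A sketch of that idea in TypeScript (fetchNewQuestions and the URL scheme are made up for illustration):

import { writeFileSync } from "node:fs";

interface Question { slug: string; updatedAt: string }

// Stand-in for a real database query returning recently added questions.
async function fetchNewQuestions(): Promise<Question[]> {
  return [{ slug: "java-hashmap-vs-hashtable", updatedAt: "2024-08-26" }];
}

// Nightly job: regenerate sitemap.xml from the latest content.
async function buildSitemap(): Promise<void> {
  const questions = await fetchNewQuestions();
  const urls = questions
    .map((q) => `  <url>\n    <loc>https://example.com/questions/${q.slug}</loc>\n    <lastmod>${q.updatedAt}</lastmod>\n  </url>`)
    .join("\n");
  const xml = `<?xml version="1.0" encoding="UTF-8"?>\n<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>\n`;
  writeFileSync("sitemap.xml", xml); // serve this file from the site root
}

buildSitemap().catch(console.error);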

In addition, some search engines also support the active uploading and submission of Sitemap files, which can further shorten the time it takes for a website to be found and included.
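
Active submission boils down to calling the engine's push interface with your new URLs. A rough sketch follows, but note that the endpoint and token below are placeholders; check each search engine's webmaster documentation for the real API:

// Hypothetical push of freshly published URLs to a search engine's
// submission endpoint; the endpoint URL and token are placeholders.
const endpoint = "https://search-engine.example.com/push?site=example.com&token=YOUR_TOKEN";

async function pushUrls(urls: string[]): Promise<void> {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "text/plain" },
    body: urls.join("\n"), // one URL per line
  });
  console.log("submission status:", res.status);
}

pushUrls(["https://example.com/questions/java-hashmap-vs-hashtable"]).catch(console.error);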

 

5. SSR server-side rendering

Note that the SSR here is not the SSR you pull when drawing cards in a game!

SSR (server-side rendering) is one of the most effective techniques for SEO. It means the server generates the full HTML page and sends it directly to the browser. Compared with the traditional approach, where the front end dynamically requests data via AJAX and renders it in the browser, SSR lets search engines crawl the complete page content much more easily, improving SEO results.

As an example, when a front-end website requests its data dynamically, the crawler may only see a crippled version of the page content, as shown below:

This is because the browser first pulls a bare page from the server, loads the JS script, and only then sends a request to fetch the data.

Whereas with server-side rendering, the server fetches the data, assembles it into the page, and returns the result to the front end, so the crawler sees much more complete page content, as shown below:
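
A minimal TypeScript sketch of the difference, using Node's built-in http module (the data is made up; a real project would use an SSR framework):

import { createServer } from "node:http";

// Stand-in for the database/API call the page depends on.
async function fetchQuestions(): Promise<string[]> {
  return ["What is a HashMap?", "Explain the JVM memory model."];
}

// SSR: the server fetches the data and returns a complete HTML document,
// so the crawler sees the real content in the very first response.
createServer(async (_req, res) => {
  const items = (await fetchQuestions()).map((q) => `<li>${q}</li>`).join("");
  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  res.end(`<html><body><h1>Java Interview Question Bank</h1><ul>${items}</ul></body></html>`);
}).listen(3000);

// With client-side rendering, that first response would instead be an almost
// empty shell like <div id="app"></div> plus a JS bundle, and a crawler may
// index the page before any data has been fetched.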

Server-side rendering is great, but it also puts more pressure on the server, and development costs are usually higher. For example, our Interview Duck is developed with an SSR framework, and we stepped on plenty of pitfalls along the way.

Oh right, PHP makes it easy to build server-side rendered websites, which is probably one of the reasons PHP used to be so popular.

 

6. SSG static site generation

Somewhat similar to SSR, SSG is another powerful weapon for SEO optimization. When the website is built, it pre-generates all of the site's pages as static HTML files and deploys them directly to the server. When a user visits the site, the generated HTML file is served as-is; compared with SSR, the server doesn't have to fetch data on the fly for each request.

This approach not only greatly improves page load speed, but also lets search engines index all pages faster and more completely. That's why many blog site generators (such as Hugo, VuePress, and Hexo) compile your written articles into static HTML before deploying them to the server.
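
The core idea behind these generators, in a toy TypeScript sketch (real generators add templates, routing, asset handling, and much more):

import { mkdirSync, writeFileSync } from "node:fs";

// Static site generation: at build time, render every page to an HTML file,
// then deploy the output directory to any static file server or CDN.
const pages = [
  { slug: "java-basics", title: "Java Basics", body: "..." },
  { slug: "frontend-faq", title: "Front End FAQ", body: "..." },
];

mkdirSync("dist", { recursive: true });
for (const page of pages) {
  const html =
    `<html><head><title>${page.title}</title></head>` +
    `<body><h1>${page.title}</h1><p>${page.body}</p></body></html>`;
  writeFileSync(`dist/${page.slug}.html`, html);
}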

Of course, SSG is not a silver bullet; it suits websites whose content is relatively fixed and rarely updated, such as personal blogs. A static site is essentially a cache: if the page content changes frequently, you'll have to regenerate these files frequently, which is no small cost.

So we can adopt a more advanced strategy: combine SSR + SSG! Generate pages with relatively fixed content statically, render pages with changing content on the server, and render pages that don't need SEO purely on the client.

 

7. Throwing money at it

Note that none of the approaches above guarantees results; they only increase the probability of being included and improve your ranking odds. An SEO strategy needs continuous adjustment and long-term validation.

If your team has no one who understands SEO and you want your website recommended by search engines quickly, then the only option left is to throw money at it. Simple and crude: pay for advertising so your page appears among the top search results. Many companies do exactly this, but for individual webmasters with no revenue, it's better to stick honestly with the methods recommended above.

 


 

That's all for this share. There are actually more non-technical strategies as well; I'll share them once I've figured them out. If you found this helpful, remember to follow Fishskin and give it a like~

 

More Programming Learning Resources

  • Java front-end programmer must-do project practical tutorials + Bijou website

  • Free Programming Learning Exchange Community for Programmers (a must for self-study)

  • Programmer Nanny's Guide to Writing a Resume for a Job Search (Essential for Job Hunting)

  • Free Interview Brush-Up Website Tool for Programmers (Essential for Job Hunting)

  • The latest Java zero-based introductory learning route + Java tutorials

  • The latest Python zero-based introductory learning route + Python tutorials

  • The latest front-end zero basic introductory learning route + front-end tutorials

  • The latest data structures and algorithms zero-based introductory learning route + algorithm tutorials

  • The latest C++ zero-based introductory learning route, C++ tutorials

  • The latest database zero-based introductory learning route + database tutorials

  • The latest Redis zero-based introductory learning route + Redis tutorials

  • Latest Computer Basics Introductory Learning Route + Computer Basics Tutorials

  • The latest small program introductory learning route + small program development tutorials

  • Newest SQL Zero-Base Beginner Learning Route + SQL Tutorials

  • The latest Linux zero-based introductory learning route + Linux tutorials

  • The latest Git/GitHub zero basic beginner learning route + Git tutorials

  • Latest OS zero-based introductory learning route + OS tutorials

  • Latest Computer Networking Zero-Base Beginner Learning Route + Computer Networking Tutorials

  • The latest design patterns zero basic introductory learning route + design patterns tutorials

  • Latest software engineering zero-based introductory learning route + software engineering tutorials