
Notes on Technical SEO - All About Technical SEO

Published: at 11:50


Duplicate content and cannibalization

This article explains duplicate content, how it affects a page's visibility on search engine result pages (SERPs), and how canonicalization addresses the problem. It covers the common causes of duplicate content, defines canonicalization and its benefits for webmasters, and examines the signals Google uses when choosing the canonical version of a page. It then offers practical suggestions for aligning your preferred pages with Google's preferences, a checklist for auditing canonical errors, a comparison of implementing canonicalization via the HTML head versus the HTTP header, and a list of common canonical mistakes to avoid.

https://www.womenintechseo.com/knowledge/dealing-with-duplicate-content-canonicalization-in-detail/
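The article above compares declaring a canonical in the HTML head versus in an HTTP header. A minimal sketch of both forms (the URLs are placeholders, not taken from the article):

```html
<!-- Option 1: in the <head> of the duplicate page,
     pointing search engines at the preferred URL -->
<link rel="canonical" href="https://example.com/preferred-page/" />

<!-- Option 2: for non-HTML resources (e.g. PDFs), send an
     HTTP response header instead:
     Link: <https://example.com/preferred-page.pdf>; rel="canonical" -->
```

The HTTP header form is useful precisely where the HTML form cannot apply: file types that have no `<head>` to edit.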

Blocked pages that get indexed

Eoghan Henn from Rebelytics shares his experience resolving the "Indexed, though blocked by robots.txt" issue for the e-commerce site zamnesia.com. He walks through the different URL types discovered during the audit and how each was handled individually, highlights the limitations of blocking URLs via robots.txt, and presents alternative solutions. The key takeaways are understanding the implications of blocking URLs via robots.txt, recognizing alternative ways to conserve crawl resources, and managing URL indexation effectively.

https://www.rebelytics.com/fixing-indexed-though-blocked-by-robots-txt-case-study/
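The limitation the case study highlights is worth spelling out: a robots.txt `Disallow` only stops crawling, not indexing, so a blocked URL can still appear in SERPs if external links point at it. To keep a page out of the index, it must be crawlable and carry a noindex directive instead. A sketch under those assumptions (the paths are illustrative):

```
# robots.txt — prevents crawling only; the URL can still be
# indexed (without a snippet) if other sites link to it
User-agent: *
Disallow: /cart/

# To deindex a page instead, do NOT block it here; allow crawling
# and serve on the page itself either:
#   <meta name="robots" content="noindex">
# or the HTTP response header:
#   X-Robots-Tag: noindex
```

Combining both (disallow plus noindex) is self-defeating: if crawlers cannot fetch the page, they never see the noindex directive.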