This chapter reviews the literature on searching behaviour, search engines, websites, and SEO.

3.1 Searching Behavior and Search Engines

Understanding the sociodemographic and behavioural characteristics of Internet users is crucial. Previous research indicates that young, highly educated individuals are heavy users of Web search, and that e-travellers draw on a variety of information sources when planning their trips. Prior experience, education level, occupation, and network accessibility all influence how users search for information on the Web. Notably, users search in different ways, partly because many lack adequate knowledge of how search engines process queries.

Law and Huang report that search engines are the most important channels through which users find travel and hotel websites. Nearly half of their respondents looked no further than the third screen of returned results. Another study found that almost 90% of Internet users locate websites through search engines.

There are two primary ways to search the Web: using a search engine, or following the links in a specially designed directory, subject gateway, or site. Users generally do not care how search engines work; they care about the results and how to obtain information from the websites those results point to.

When using search engines, most people view fewer than 30 results, roughly what fits on three computer screens. They tend to enter simple queries, averaging three to four words, and view only a small number of result pages. Some users employ advanced search options such as Boolean operators, modifiers, and phrase search, while others simply try a different search engine. About 85% of Internet users have been found to rely on search engines to locate information on the Web. Trusting a search engine to retrieve the right material from only a few keywords, however, carries a risk: relevant pages that do not match those keywords are never seen.

Users tend to be loyal to search engines such as Google and Yahoo because they perceive them as delivering relevant, satisfactory results. Yahoo's popularity has been attributed to its appearance, accessibility, brand name, and overall usability. Search engines differ in their interactive interfaces, search methods, document-ranking and display methods, indexing models, and limitations. Most users know little about how search engines retrieve information from the Web or how they order the links on a results page.

Search engines rank the relevance of websites using keyword factors and a “probable relevance” scoring method. The score combines the prominence, frequency, weight or density, proximity, and placement of keywords; the popularity of the website; off-page criteria (including inbound and outbound links); and term/theme vectors. Search engine scoring systems and algorithms, however, are trade secrets never disclosed to the public. The most popular search engines are Google (with a 42% usage rate), Yahoo (32%), MSN (27%), and AOL (14%).
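To make these keyword factors concrete, the sketch below scores a page for a single keyword using frequency, density, and prominence. It is purely illustrative: real scoring formulas are trade secrets, and the weights and combination rule here are invented.

```python
import re

def keyword_score(text, keyword):
    """Toy relevance score combining frequency, density, and prominence.

    Illustrative only: the weights below are invented for this sketch
    and do not reflect any real search engine's algorithm.
    """
    words = re.findall(r"[a-z0-9]+", text.lower())
    kw = keyword.lower()
    positions = [i for i, w in enumerate(words) if w == kw]
    if not positions:
        return 0.0
    frequency = len(positions)
    density = frequency / len(words)  # share of all words on the page
    # Prominence: earlier occurrences count more (near 1.0 at the start).
    prominence = sum(1 - p / len(words) for p in positions) / frequency
    return frequency * 0.2 + density * 10 + prominence

text = "SEO basics: SEO improves ranking. Learn SEO techniques here."
print(round(keyword_score(text, "seo"), 3))
```

A real engine would also weight placement (title, headings) and off-page signals, which this single-document sketch cannot see.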

3.2 Website and Search Engine Optimization

Website optimization refers to designing web pages so as to improve their position and links in search engine results, with the aim of helping customers find the pages they want. The process covers the entire website: programming, HTML coding, scripting, and finely tuned keywords and phrases. The objectives are maximum speed and a better chance of appearing at the top of search results for the selected keywords or queries. As such, website optimization is a powerful online marketing strategy that lets users find the pages they want through different search keywords and questions.

SEO aims to increase search engine rankings and presence using relevant keywords, symbols, numbers, letters, spider idiosyncrasies, and algorithm factors, with or without complicated navigational structures. Search engine placement is one of the most cost-effective e-marketing and promotion strategies, and SEO is an essential approach to website optimization, driving potential users to hotel websites to browse and purchase. The authors elaborate on SEO tools and strategies specific to e-commerce sites for effective website promotion. When an optimized website is found through a search engine, it can attain a higher ranking position, which improves traffic and enhances sales capability. This requires specific tools, techniques, and search engine friendly strategies.

The Tools of SEO category includes usability tools, keyword tools, link tools, and high-quality incoming links. The Strategies of SEO category covers three methods: website structure, space strategy, and website title writing. The Friendly Methods of SEO category covers structural optimization of frames, images, URLs, directory structures, website navigation, Flash content, and web forms.

Wang et al. employed the back-propagation neural network technique to optimize a search engine for speedy information retrieval from the Web. The authors claim that the neural network reduces information overload by tailoring the accessible information to a particular user's requirements. The idea is to profile users' behaviour while they search for information online and then optimize websites on the basis of the characteristics collected through profiling, to achieve higher page rankings.

SEO increases the ranking of search results in Internet marketing. Hui-ye et al. report that the rank and bandwidth of motel sites increased after SEO techniques were implemented. The techniques used include text titles, label text, picture notes, HTML modifications, site maps, open website catalogues registered in DOMS, web pings, Internet discussion boards, and keyword signature lines. Such services and technologies have been rapidly replicated to facilitate running businesses through web-based applications. Chung and Hui state that SEO tools can help organizations such as banks, governments, and other institutions improve their web services and grow their business amid today's global competition.

The authors specifically emphasize using image search, proximity organic search, and top-k keywords for optimizing the web server. SEO tools and techniques for web server development include search-indexed web pages, optimization, selection of the right keywords, on-site web analytics, attracting links, and off-site web analytics. Supplementary web-intelligence techniques include query and page-ranking factors, while attention to trust, performance, reliability, enterprise application integration, security, and reputation ensures service quality.

SEO is also used to rank business information higher in search results. Yunfeng applied SEO algorithms and techniques to website development; the research covers search engines, web design methods, and Internet marketing. Search engines have become an essential part of everyday life, and enterprises use them for marketing because they enhance web page retrieval. The crucial elements of a website are built around the retrieval principles of SEO, so the site ranks higher in natural search results. The research emphasizes specific SEO algorithms, such as the PR (PageRank) and Hilltop algorithms, and stresses particular web development tactics: website content, links, and keyword tactics.
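The PR computation referred to above can be sketched as a short iterative process. This is a minimal version of the published PageRank formulation only; production ranking combines many additional signals.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iterative PageRank over a link graph given as {page: [outlinks]}.

    Minimal sketch of the published formulation, not Google's
    production system.
    """
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:
                share = rank[page] / len(outs)
                for target in outs:
                    new[target] += damping * share
            else:  # dangling page: spread its rank evenly
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
print(max(ranks, key=ranks.get))  # "c": it receives the most inlinks
```

The damping factor of 0.85 is the value given in the original PageRank paper; ranks sum to 1 and converge after a few dozen iterations on small graphs.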

The relevant tactic is to develop websites with SEO in mind so that they achieve higher rankings in natural search results. Both corporations and institutions seek wider recognition worldwide through higher-ranking sites. Ahmad and Ayu focus on the relationship between Webometrics rank and web content accessibility in the search ranking system. The authors compare the Web Content Accessibility Guidelines (WCAG) and Webometrics techniques to determine which better supports higher rankings, and after analyzing the search results they find a positive correlation between WCAG compliance and Webometrics rank. The research also investigates the correlation between WCAG and search engine rank. Because the reliability of several search engines depends on their ranking systems, the authors prefer the Webometrics ranking system over search engine ranking. The significant elements here are web accessibility, the Web Content Accessibility Guidelines themselves, web content tools, and the Webometrics ranking system.

A search engine is a means of retrieving web pages related to user requests online. Vijayalakshmi et al. focus on a filtering system that decreases the number of irrelevant pages in search results, proposing a two-tier link extractor as a new search engine filtering technique.

This web filtering system addresses specific issues in searching the World Wide Web. The two-tier link extractor is built on search engine filtering to reduce irrelevant documents in the results, using techniques that include link URL filtering, weight assignment, re-ranker principles, a link extractor, a vector formulator, and a content filter. The process retrieves the documents most relevant to a user's query.

SEO approaches fall into four areas: keyword optimization, content optimization, link optimization, and structure optimization. Zhang et al. examine the impact of SEO techniques and analyze their efficiency to find out which approaches are the most effective. Online information searching is a major activity in today's Internet world; SEO improves both the quality and the traffic volume of websites in search results, and efficient SEO methods make websites prominent in the results a search engine generates.
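A drastically simplified filter-then-re-rank pipeline of this kind is sketched below. The function name, weighting rule, and example data are invented for illustration and are not taken from the cited study.

```python
def filter_and_rerank(results, blocked_terms, query_terms):
    """Sketch of a two-stage pipeline: URL filtering, then re-ranking.

    Stage one drops results whose URL contains a blocked term; stage
    two re-ranks survivors by query-term frequency in the snippet.
    Hypothetical simplification, not the cited two-tier extractor.
    """
    kept = [r for r in results
            if not any(t in r["url"] for t in blocked_terms)]

    def weight(r):
        snippet = r["snippet"].lower()
        return sum(snippet.count(t.lower()) for t in query_terms)

    return sorted(kept, key=weight, reverse=True)

results = [
    {"url": "http://spam.example/ads", "snippet": "cheap deals deals"},
    {"url": "http://hotel.example/rome", "snippet": "hotel rome booking"},
    {"url": "http://blog.example/rome", "snippet": "my trip to rome"},
]
ranked = filter_and_rerank(results, ["spam."], ["hotel", "rome"])
print([r["url"] for r in ranked])
```

The spam URL is removed in the first stage, and the hotel page rises above the blog because it matches both query terms.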

SEO techniques aim to create a more attractive website that ranks higher in search results and draws more visitors. Practical methods for achieving higher rankings include indexed pages, an independent or static Internet Protocol (IP) address, and crawled links. In this study, the authors use six SEO techniques: link popularity, page size, web directory listing, customization of 404 error pages, website title length, and keyword density.
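Two of these six factors, title length and keyword density, are easy to measure directly. The toy checker below does so; the thresholds are common rules of thumb, not values taken from the cited study.

```python
import re

def onpage_check(title, body, keyword,
                 max_title_len=60, density_range=(0.01, 0.03)):
    """Toy on-page check for title length and keyword density.

    The 60-character title limit and 1-3% density band are common
    rules of thumb, assumed here for illustration only.
    """
    words = re.findall(r"[a-z0-9]+", body.lower())
    density = words.count(keyword.lower()) / max(len(words), 1)
    return {
        "title_ok": len(title) <= max_title_len,
        "density": round(density, 4),
        "density_ok": density_range[0] <= density <= density_range[1],
    }

report = onpage_check("Rome Hotels - Book Direct",
                      "Find a hotel in Rome. Our hotel deals. " * 10,
                      "hotel")
print(report)
```

In this example the title passes, but the repeated text pushes keyword density far above the band, the kind of over-optimization density checks are meant to flag.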

Search engine spamming artificially inflates the ranking of unworthy web pages and sites. Somani and Suman examine black hat techniques in order to counter spamming in SEO: analysing these techniques helps to identify target pages and trace the link graphs that generate spam. SEO makes a website friendly to search engines so that it can easily be found through related keywords. The authors highlight two families of spamming techniques, hiding and boosting, and note that other methods, such as spam blogs and continually evolving techniques, increase spamming in search engines.

The authors employ several methods to decrease spamming in search engines, including spam detection, query refinement, and combining spam detection with the labelling of spammed sites through PageRank. Understanding black hat techniques thus helps reduce spamming in search engines and raise the ranking of legitimate websites.

Zhang et al. propose a full-text search rank optimization method for e-commerce, validated on the CNBAB model (an Asian e-commerce trust platform). Many algorithms and protocols are used to optimize e-commerce websites, and CNBAB is a trustworthy and credible model for implementing them in e-commerce and trading; the CNBAB trust model is based on trust measurement and trust recommendation.

SEO is a technique that enhances the rank of websites and web pages. Zhu and Wu analyzed SEO through reverse engineering, building a system that automatically crawled 200,000 web pages and then examining Google search results in terms of PR, URL, and HTML factors. Website owners want their sites to rank above others; the SEO industry has its own optimization goals, and every search engine has a crawler that refreshes pages. The factors the authors analyze are PR, URLs, and HTML, and they expound five SEO factors: URL length, keyword appearance in the URL domain, keyword density in H1 headings, keyword density in the title, and the number of URL layers.

The World Wide Web is a global knowledge repository. Rajaram focuses on web caching in semantic web techniques for developing multiple search engines, using clustering of web semantics for optimization and analyzing the advantages and disadvantages of these techniques. Web caching is used to reduce latency and network traffic. The author suggests the following replacement algorithms: the least weighted usage algorithm, the LUV algorithm, and the lowest relative value algorithm. Web caching in the semantic Web solves the optimization problem at the architecture level, and the author uses web crawling, ontology search design, and clustering of web results to develop multiple search engines.

Kumar and Mohan propose a semantic web search engine with a layered architecture that increases information retrieval accuracy using ontology-based concepts and relations. Several ranking algorithms for the semantic Web using relation-based metadata have been suggested; most use page relevance criteria based on data that must be derived from the entire knowledge base, which often makes them unfeasible in huge semantic environments. The proposed system instead emphasizes information extracted from user queries on annotated resources. Relevance between queries is measured in
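As a concrete illustration of how caching reduces latency and network traffic, the sketch below implements a minimal least-recently-used (LRU) cache. LRU is a simple stand-in for the weighted and relative-value replacement algorithms named above, which are more elaborate.

```python
from collections import OrderedDict

class WebCache:
    """Minimal least-recently-used (LRU) web cache.

    Simplified stand-in for weighted/relative-value replacement
    policies; used here only to show why caching cuts repeat fetches.
    """
    def __init__(self, capacity=2):
        self.capacity = capacity
        self.store = OrderedDict()  # url -> page body
        self.hits = self.misses = 0

    def fetch(self, url, origin):
        if url in self.store:
            self.store.move_to_end(url)      # mark as recently used
            self.hits += 1
            return self.store[url]
        self.misses += 1
        body = origin(url)                   # slow network fetch
        self.store[url] = body
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used
        return body

cache = WebCache(capacity=2)
origin = lambda url: f"<html>{url}</html>"
cache.fetch("/a", origin); cache.fetch("/b", origin); cache.fetch("/a", origin)
print(cache.hits, cache.misses)  # prints "1 2"
```

The second request for `/a` is served from memory, so only two of the three fetches reach the network; weighted policies refine only the eviction decision.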

3.3 Summary 

Previous studies have applied SEO to case studies of many kinds of websites: e-commerce sites, CMS blogs built with Joomla and WordPress, business sites, e-marketing sites, hotel sites, and classroom student blogs. Each used different SEO techniques, methods, and tools to increase the rank and traffic of its website, and each studied and classified SEO criteria and factors from its own point of view. In this research, the researcher likewise organizes SEO criteria and factors and applies selected SEO techniques and methods to an actual WordPress (CMS) website to improve its rank and traffic on the SERP. Some of Google's tools are used to compare the results of SEO analysis before and after optimization. Table 3.1 summarizes the SEO techniques used in previous studies.

1. Technique used: six SEO techniques: link popularity, page size, web directory, customization of 404 error pages, website title length, and keyword density.
   Advantage/limitation: back-propagation neural networks improve the speed of information retrieval from the Web on the authors' search engine.
2. Technique used: six SEO techniques: link popularity, page size, web directory, customization of 404 error pages, website title length, and keyword density.
   Advantage/limitation: development of websites.
3. Technique used: SEO algorithms and techniques implemented to optimize the website's search engine performance.
   Advantage/limitation: a positive correlation was observed between WCAG and Webometrics; the authors prefer the Webometrics ranking system over the search engine ranking system.
4. Technique used: six SEO techniques: link popularity, page size, web directory, customization of 404 error pages, website title length, and keyword density.
   Advantage/limitation: practical techniques for obtaining a higher rank include indexed pages, an independent or static IP address, and crawled links.
5. Technique used: a black hat technique proposed to counter spamming in SEO.
   Advantage/limitation: back-propagation neural networks improve the speed of information retrieval from the Web on the authors' search engine.
6. Technique used: web caching in semantic web techniques for developing multiple search engines, using clustering of web semantics for optimization; advantages and disadvantages of these techniques analyzed.
   Advantage/limitation: back-propagation neural networks improve the speed of information retrieval from the Web on the authors' search engine.
Table 3.1: Summary of SEO techniques in previous studies