PHP example of URL parsing for «User Friendly URLs»



24 PHP packages found for «url friendly»

jaybizzle/safeurl

A Laravel package to create safe, SEO friendly urls

diarmu

PHP library to generate short, non-sequential, URL-friendly hashes of incremental ids


pishran/persian-slug

Generate a URL friendly slug from a given string.

alexeymezenin/laravel-russian-slugs

Russian SEO friendly slugs for Laravel 5

mikefrancis/tokenizr

PHP library to generate unique, URL friendly tokens

wispx/slug

Friendly, SEO friendly url structure.

expstudio/friendly-url

Easy creation of slugs or friendly url for your Eloquent models in Laravel 4 with PHP5.3.x and non latin character support.

ballen/sluginator

A simple URL slug creation library for PHP, simply feed it a string (such as a blog title) and it will create a URL friendly slug.

carc1n0gen/short-link

package for generating short url friendly strings from integer ids

maks757/yii2-friendly-url

Friendly url for your library or module

nercury/object-router

Allows you to create and manage friendly routes for objects. Requires doctrine-orm to store routes.

pavlinter/yii2-url-manager

Yii2: UrlManager create friendly URL

magroski/permalink

A library to create url friendly slugs

ink/ink-str

Generate a URL friendly «slug» from a given string. (Skips ASCII)

terrylinooo/seo-search-permalink

Change default search URLs to SEO-friendly URLs. It may improve your SERP and boost your site traffic. The default URL ?s=keyword will be changed to /search/keyword, and you can change that to fit your needs.

theprivateer/permalink

Simple utility for creating URL friendly ‘slugs’

sergiocardoso/phpicture

Generate friendly thumbs from url with parameters


balloondev/phurl

Url friendly engine

yiiforces/yii2-pjax-url-behavior

Behavior to generate or overload the header X-PJAX-URL in a get request for pjax

badian/translit

A plugin for transliterating Cyrillic strings into Latin. May be needed for creating friendly URLs.

troquatte/url-rotas-amigaveis-simples-para-pequenos-sites-em-php

Simple friendly-URL and route generator

xz1mefx/yii2-ufu

User friendly URLs tools package

comradepashka/yii2-seokit

Set of tools that helps to construct SEO-friendly site

softhem/yii2langswitcher

Yii 2 language switcher with SEO friendly url

Ruby on Rails: user friendly URLs

This article shows an example of how to make pretty links in a Rails project. Links of the form /posts/1/ will be transformed into /posts/1-article-name/

Preparation

Let's start by installing the latest version of Rails, running gem install rails -v=3.1.3 in the console.
Once the gem has finished installing, create a new project with the command rails new nice_urls . The result is a fresh, clean project with all of its gems installed, because bundler is run automatically at the end of project generation.

Creating posts

Ordinary scaffolding is enough for the demonstration. Let's generate it for articles that have a title and a text:
rails g scaffold Post title:string text:text
To apply the changes to the database, run the migration with rake db:migrate . Now we can start the server (the rails s command) and look at what we have at localhost:3000/posts
We will see the familiar interface for adding posts.
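In Rails, links of the form /posts/1-article-name/ are conventionally produced by overriding to_param on the model; a minimal sketch (the method body is my assumption, not the article's own listing):

    # app/models/post.rb
    class Post < ActiveRecord::Base
      # URL helpers such as post_path(@post) call to_param,
      # so generated links become /posts/1-article-name/
      def to_param
        "#{id}-#{title.parameterize}"
      end
    end

Post.find(params[:id]) keeps working unchanged, because String#to_i reads the leading digits of «1-article-name» and ignores the rest.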

User Friendly URL — rule template

Rule templates provide a simple way of creating one or more rewrite rules for a certain scenario. The URL Rewrite module includes several rule templates for common usage scenarios, and the URL Rewrite module UI provides a framework for plugging in custom rule templates. This walkthrough will guide you through how to use the «User Friendly URL» rule template that is included with the URL Rewrite module.

Prerequisites

This walkthrough requires the following prerequisites:

  1. IIS 7.0 or above with ASP.NET role service enabled;
  2. URL rewrite module 2.0 release installed.

Setting up a test web page

We will be using a simple test ASP.NET page to verify that the rules created by the template work correctly. The test page simply reads the web server variables and outputs their values in the browser.

Copy the following ASP.NET code and put it in the %SystemDrive%\inetpub\wwwroot\ folder in a file called article.aspx:
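A minimal sketch of such a page (assumed, not the walkthrough's verbatim listing: it echoes the server variables the rules manipulate and carries the «Link URL» anchor used later to test the outbound rule):

    <%@ Page Language="C#" %>
    <html>
    <head><title>URL Rewrite Test</title></head>
    <body>
        <h2>URL Rewrite Module Test</h2>
        <!-- Echo the server variables that the rewrite rule changes -->
        <p>Requested path: <%= Request.ServerVariables["SCRIPT_NAME"] %></p>
        <p>Query string: <%= Request.ServerVariables["QUERY_STRING"] %></p>
        <!-- Used later to verify the outbound rewrite rule -->
        <a href="/article.aspx?id=432&amp;title=some-other-title">Link URL</a>
    </body>
    </html>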

After copying this file, browse to http://localhost/article.aspx and check that the page was rendered correctly in a browser.

Using the rule template to generate rewrite rules

The «User Friendly URL» rule template can be used to generate rewrite, redirect and outbound rules that make the URLs of your dynamic web application more user- and search-engine-friendly. Typically, dynamic web pages take query string parameters into account when generating the output HTML. URLs with query strings (e.g. http://contoso.com/articles.aspx?year=2008&month=11 ) are not as easy for humans to use and communicate as simple hierarchy-based URLs (e.g. http://contoso.com/articles/2008/11 ). In addition, some search engine crawlers may ignore the query string when indexing the site's pages. The rule template helps you generate a rewrite rule that transforms hierarchy-based URLs into URLs with query strings. The template can also, optionally, generate a redirect rule that redirects web clients from URLs with query strings to clean URLs. Finally, it is possible to create an outbound rewrite rule that replaces all occurrences of URLs with query strings in the HTML response with their hierarchy-based equivalents.

To use the template follow these steps:

  1. Go to IIS Manager
  2. Select «Default Web Site»
  3. In the Feature View click «URL Rewrite»
  4. In the «Actions» pane on the right-hand side click on «Add rules…» and then select the «User Friendly URL» template:
  5. In the «Add rules to enable user friendly URLs» dialog enter an example of a URL with query string parameters, such as http://localhost/article.aspx?id=123&title=some-title , and then expand the drop-down list with suggested options for how that example URL can be transformed into a URL without a query string.
  6. Choose the second option: http://localhost/article/123/some-title . Notice that the URL pattern and substitution URL have been updated accordingly. These will be used in the rewrite rule that will be created by the rule template. Check «Create corresponding redirect rule» to create a redirect rule that will be used when web clients request a web page by its internal URL. Those clients will be redirected to the corresponding public URL.
    Also, check the «Create corresponding outbound rewrite rule» to create an outbound rule that will replace all instances of internal URLs in the response HTML with their public equivalents.
  7. Click «OK» so that the rewrite, redirect and outbound rules will be generated:

Testing the rule


To test the generated rules, open a Web browser and request the URL http://localhost/article/234/some-title .

You should see that the rewrite rule on the web server has changed the original URL to article.aspx and has passed «234» and «some-title» as values for the query string parameters.

In the web browser, move the mouse over the «Link URL» text (or use menu «Page» -> «View Source…»). Notice that even though the URL for the link was originally in a format that used query string parameters, it has been replaced by the URL Rewrite Module with the hierarchy-based URL format.

Also, if you request http://localhost/article.aspx?id=432&title=some-other-title you will see that the browser gets redirected to http://localhost/article/432/some-other-title .

Summary

In this walkthrough you have learned how to generate rewrite rules by using the «User Friendly URL» rule template included in the URL Rewrite module. This rule template can be used as a starting point for designing rewrite rules that enable user-friendly and search-engine-friendly URLs for your existing web applications.

Make your URLs SEO-friendly

One of the useful functions of the .htaccess file is the ability to change URLs. You can use it to redirect one URL to another or use it to make a broad sweeping change to all URLs on your website. A facet of this ability was demonstrated in the last section.

In this chapter, we will look at how to use rewrite rules to fix a common SEO problem: non-SEO-friendly URLs. SEO-friendly URLs are user- and machine-readable, with a logical hierarchy.

Understanding SEO-friendly URLs

SEO-friendly URLs are fundamentally URLs that use relevant, common language, and semantically significant words in the domain, path, and page. What does that mean? It means that people who look at the URL should be able to read it and understand what should be on the page it serves.

Example: SEO-Friendly URLs

This URL is NOT SEO-friendly: example.com/index.php?p=283&cat=5

This URL is SEO-friendly: example.com/guides/seo-friendly-urls

If this seems complicated, you can follow several basic rules that will help. Those rules are as follows:

  • Use a single domain.
  • Only use a subdomain if it is absolutely necessary.
  • Use plain words for the folder/paths and page names.
  • Use keywords in the URL. Just using words is not the same as using descriptive, relevant words.
  • Shorter URLs are better.
  • Keep URLs similar to page titles.
  • Keep special characters out of the URL as much as possible.
  • Avoid having too many folders.
  • Try to stick to all lowercase URLs.
  • Separate words with hyphens.

If you follow these rules, your URLs will be much more readable.

SEO-friendly URLs via .htaccess

Now that we have covered what SEO-friendly URLs are, we can look at how to use the .htaccess file to achieve them.

Removing file extensions

One of the problems with many websites is that the file extensions are cluttering the URL. This is not necessarily bad from a technical SEO perspective. However, removing them can clean up the URLs and improve the user experience.

The following code will change example.com/cool-page.html to example.com/cool-page .
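One common shape for such a rule set (a sketch, assuming mod_rewrite is enabled):

    RewriteEngine On
    # Externally redirect /cool-page.html to /cool-page
    RewriteCond %{THE_REQUEST} \s/([^.\s]+)\.html [NC]
    RewriteRule ^ /%1 [R=301,L]
    # Internally serve /cool-page from cool-page.html if that file exists
    RewriteCond %{REQUEST_FILENAME}.html -f
    RewriteRule ^(.+?)/?$ $1.html [L]

Matching against THE_REQUEST (the original request line) rather than the rewritten URL keeps the two rules from feeding each other in a loop.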

You can do the same thing with example.com/cool-page.php by changing each instance of .html to .php in the previous snippet.

Redirect index.php to the root

It is common to have an index.php file as part of most PHP-based CMSs. Having it in the URL does not look good and is bad for the user experience. This is a simple fix:
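A sketch of the straightforward version:

    RewriteEngine On
    # 301 example.com/index.php (or /blog/index.php) to the directory root
    RewriteRule ^(.*/)?index\.php$ /$1 [R=301,L]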

I have seen this cause an endless redirect loop. If that happens on your server, try the following method instead.
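A loop-safe variant (a sketch): CMSs often rewrite every request internally to index.php, which re-triggers the rule above, so this version matches the original request line instead.

    RewriteEngine On
    # THE_REQUEST holds the original request line, e.g. "GET /index.php HTTP/1.1",
    # and is not affected by internal rewrites made by a CMS front controller
    RewriteCond %{THE_REQUEST} \s/+(.*?)index\.php[\s?] [NC]
    RewriteRule ^ /%1 [R=301,L]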

Making URLs lowercase

Some Linux servers don’t handle uppercase characters in URLs very well; the server will often return a 502 Bad Gateway response. In my experience, this is more common with non-physical URL paths. In other words, if example.com/images/ is not part of the physical file system on your web server, it is more likely to return an error code.


Google’s John Mueller responded to a question about uppercase characters in URLs.

URLs are case-sensitive, but pick whatever case you want.

John Mueller tells us that Google does not care what case you use. However, URLs are case-sensitive. This means that example.com/images/ is not the same as example.com/IMAGES/ . This can cause a duplicate content issue.

I personally recommend using lowercase for everything on your web server and website. This allows for the simplest correction of this issue.

Note: In most cases, a canonical tag is enough to correct this issue. However, this does not affect images and other non-HTML documents.

To solve this issue, we first need to determine whether there are capital letters. If there are, we will redirect to the same URL with all lowercase characters:
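A sketch of a common shape for this rule set, using the HASCAPS environment flag discussed below (only three of the 26 per-letter rules are written out):

    RewriteEngine On
    # Remember that the requested URL contained at least one capital letter
    RewriteRule [A-Z] - [E=HASCAPS:TRUE]
    # Lowercase the letters one at a time, one rule per letter of the alphabet
    RewriteRule ^([^A]*)A(.*)$ $1a$2
    RewriteRule ^([^B]*)B(.*)$ $1b$2
    # ... repeat for C through Y ...
    RewriteRule ^([^Z]*)Z(.*)$ $1z$2
    # If the flag was set, issue a single 301 to the lowercased URL
    RewriteCond %{ENV:HASCAPS} TRUE
    RewriteRule ^/?(.*) /$1 [R=301,L,DPI]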


For the sake of space, I will not discuss each element of this rule. However, please note that we are storing the variable HASCAPS via the [E] environment variable flag. This allows us to check against it later in our rewrite rule.

I have found the [DPI] flag is not always needed. However, on some server configurations, leaving it out can return a 502 error.

There are a few other ways to force lowercase URLs. You can use mod_speling; however, this module should be set to correct only the URL case if you have WebDAV enabled on any of your subdirectories. You can read more about mod_speling in the Apache documentation. (Yes, that’s mod_speling, not mod_spelling.)

If you have access to the httpd.conf file, or your host provider will make changes for you, there is another way to force lowercase URLs with .htaccess.

In this method, we will take advantage of a mod_rewrite directive that seems to be unknown or overlooked by many web developers: RewriteMap . A RewriteMap allows for predefined substitution strings that can be used in a RewriteRule . This is an invaluable tool. The trouble for most webmasters is that the RewriteMap directive cannot be used in <Directory> sections or in .htaccess files. However, you can set it in the server context via the httpd.conf file.

To force lowercase URLs with a RewriteMap we will set the following directive in the httpd.conf file.
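The directive is a one-liner; int: marks a built-in internal map:

    RewriteMap lc int:tolower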

After that, we will set a RewriteRule in our .htaccess file.
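A sketch of the matching rule; the guard condition stops it from redirecting URLs that are already lowercase:

    RewriteEngine On
    # Redirect any URL containing an uppercase letter to its lowercased form
    RewriteCond %{REQUEST_URI} [A-Z]
    RewriteRule (.*) ${lc:$1} [R=301,L]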

This is so small and pretty! It makes me happy!

Please note that lc is the map name, and it is arbitrary; however, it is vital that the map name matches in both the RewriteMap and RewriteRule directives.

The int: in the RewriteMap directive defines an internal map source. tolower is a default internal function that ships with Apache; it converts the key to lowercase characters.

Warning:

Some image tools save images with uppercase file extensions by default e.g. .PNG not .png . This can cause these images not to work in some instances. Ensure your image file extensions are lowercase before forcing lowercase URLs on an existing website.

Dynamic URL rewriting

URLs that are created dynamically (i.e. have no correspondence to the server’s physical file structure) are often not very readable. Creating a rewrite rule to «beautify» the URLs on your site can be very helpful.

If your URL looks like this: example.com/blog.php?id=123 — it is anything but SEO-friendly. We can fix this with a little help from the .htaccess file:
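A sketch, assuming the script can look posts up by a slug column (the slug parameter name is my assumption):

    RewriteEngine On
    # Serve example.com/epic-blog-post from blog.php?slug=epic-blog-post
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^([a-z0-9-]+)/?$ blog.php?slug=$1 [L,QSA]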

It will now look like this: example.com/epic-blog-post . That looks a lot better.

.htaccess is not a fix-it file

If your URLs are anything but SEO-friendly, trying to fix them with the .htaccess file is not wise. It would be difficult to create redirect rules that logically rewrite URLs with only one redirect. You are better off using your server-side language, like PHP or Python, to create SEO-friendly URLs in the first place.

There are times when you will need to combine your coding language with .htaccess to achieve the desired result. But, don’t use the .htaccess file as a catch-all for URL issues.

Clean URL Rewrites Using NGINX

This article will cover how to easily implement Clean URLs (also known as Semantic URLs , RESTful URLs , User-Friendly URLs and Search Engine-Friendly URLs ) using NGINX Web Server; currently the second most popular web server platform worldwide.

Not using NGINX web server? No problem, Apache users can check out my article Clean URL Rewrites Using Apache.

A Clean URL is a URL that does not contain query strings or parameters. This makes the URL easier to read and more understandable to users.

Clean URLs are a high ranking factor for many search engines, but there are many other reasons why having Clean URLs can be important to your website. Check out our article Are your URLs Letting Your Website Down for more information.

THE PROBLEM

Take a look at this URL used by a dynamically generated web page: example.com/products.php?id=7

The example shows a common URL structure that you may often see output from a PHP-driven CMS (Content Management System). The URL links to a dynamic PHP web page called products.php with a variable value of 7. A dynamic web page is a web page that is constructed using server-side code; in this case, PHP.

Functionally this URL is perfect; it links to the dynamic products page, and passes the variable ‘id’ with the value of 7 to the webserver to dynamically generate the content for the product associated with ID value of 7.

This is great, but the URL seems a bit archaic: it is difficult to remember, it contains no relevant keywords for search engines, and it doesn’t describe the content of the webpage very clearly.

The webpage in the example is actually a shop page for buying organic apples, but there is no way of telling this from the current URL.

Here is an example of a more ideal URL: example.com/shop/organic-apples

Our example now contains multiple keywords for search crawlers and a clear description of the contents of the webpage.

But how is the webserver supposed to handle this Clean URL? There is no reference to the dynamic page ‘products.php’ or the variable ‘id’ of the product for the webserver to pull the correct dynamic content from.

THE SOLUTION

The easiest solution is to use a rewrite engine to modify the appearance of the URL, most commonly known as URL Rewriting or URL Manipulation .
Fortunately, this solution only requires minimal restructuring or renaming of folders, dynamic files or variables. URL Rewriting is very flexible and fast to implement.

WHAT IS URL REWRITING?


URL Rewriting is the technique used for URL mapping or routing in a web application. By using URL Rewriting we can provide the information our webserver requires to interpret our Clean URL in our previous example.

NGINX Web Server

Released in 2004, NGINX (pronounced «engine x») has gained notable popularity in the past 12 years. NGINX became the second most used web server software worldwide in 2012, overtaking Microsoft IIS Web Server.
NGINX is available for Windows and many UNIX server operating system flavours.

INITIATING THE REWRITE ENGINE

NGINX comes with the pre-installed module ngx_http_rewrite_module , so there should be no initiating required, unless your install is missing the module.

To start creating rewrite rules in NGINX all we need to do is locate the NGINX configuration file that we will be adding our rules to.

  • Locate and open the NGINX configuration file nginx.conf with a text editor (Such as Sublime Text), the default location for the configuration depends on your installation type, the default locations are usually: /usr/local/nginx/conf , /opt/nginx/conf , /etc/nginx , or /usr/local/etc/nginx .
  • The configuration file will contain multiple code blocks. First navigate to the http block .
  • We are looking for the server block nested inside the http block. There may already be multiple server blocks, distinguished by the port the block listens to or server name.
    For now create a new empty server block, unless you already know which existing server block you would like to use for your rewrite rules.

BASIC URL REWRITING

To create a URL Rewrite we first need to create a location block inside the server block. The location block’s directives are tested against the URL specified in the request’s header.

We have for example the following URL: example.com/1212aJlmo.html

But we want our users to instead be able to access this page via this URL: example.com/photoshop-tutorials

We can create a basic location block to accomplish this rewrite:
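A minimal sketch, matching the directives broken down below:

    # inside the server { } block
    location = /photoshop-tutorials {
        rewrite ^/photoshop-tutorials?$ /1212aJlmo.html break;
    }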

Now let’s break this location block down and take a closer look at how it works.

    location = /photoshop-tutorials – This is the prefix matching for our location block. We have used an = sign in the syntax, which means the location block only matches a URL containing that exact prefix: it matches example.com/photoshop-tutorials and nothing more. If we remove the = sign, the location block will match any URL that begins with «photoshop-tutorials». You can also use location ~ for matching blocks with regular expressions, or location ~* for case-insensitive regular-expression matches. We will be using these types of location block later in the article.

  • rewrite – Tells NGINX that the following refers to one single Rewrite Rule.
  • ^/photoshop-tutorials?$ — The ‘pattern’ that the webserver will look for in the URL; if it is found, the webserver will swap the pattern for the following substitution. (Note the leading slash: in NGINX the URI being tested always starts with / .)
  • 1212aJlmo.html – The ‘substitution’, the webserver will swap the pattern for the substitution if the pattern is found in the URL.
  • ^, ? and $ — These are Regular Expression , also known as Rational Expression , characters; they are a sequence of characters that define a search pattern and are mainly used in pattern matching and string matching. The pattern is treated as a regular expression by default. In our example pattern we are using three regular expression characters:
    • ^ represents the beginning of a string.
    • $ represents the end of a string.
    • ? makes the preceding character optional (zero or one occurrence); here it lets the pattern match both «photoshop-tutorial» and «photoshop-tutorials». Placed after a quantifier such as * or +, the same character instead acts as a non-greedy modifier, stopping the quantifier from matching more than it needs to.
  • break – This is known as a flag. Flags are added to the end of the rewrite rule and tell NGINX how to interpret the rule. In this example the break flag lets NGINX know not to execute any further location blocks if the current rule applies. There are more flags to choose from beyond this example; we will look at them in more detail later in the article.
  • Consequently, when a user now inputs the URL example.com/photoshop-tutorials , NGINX Web Server will display the page 1212aJlmo.html without the user knowing any different.

    We have now grasped the basic technique of rewriting a single URL to a different URL.

    However, our dynamic page has several variables, and using this technique means a lot of work duplicating rewrite rules for every variable.


    In the next section we will cover more advanced patterns that can solve this.

    DYNAMIC REWRITING USING BACK REFERENCES

    If we go back to our original problem: example.com/products.php?id=7

    We have the variable «id» in this URL; but overall we have 150 products, each with a different ID.

    We want the URLs to look like the following example: example.com/product/7

    It would take a long time to write individual rewrite rules for all of the possible URLs.


    By using the following ‘location prefix’, ‘pattern’ and ‘substitution’ we can save time and also avoid pages of duplicate code:
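    A sketch, assuming the dynamic page is products.php with an id parameter, as in the problem above:

        # inside the server { } block
        location /product/ {
            # /product/127 -> /products.php?id=127
            rewrite ^/product/([0-9]+)/?$ /products.php?id=$1 break;
        }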

    Now let’s break this location block down and take a look at how it works:

    • location /product/ — The location block will match any URL that begins with /product/.
    • ([0-9]+) – There are two key points to note on this part of the pattern.
      • Take a look at the contents of our brackets: [0-9]+ . This is a regular expression; in a regular expression the square brackets [] mean «match any of the contents». For example, if we were to use [1A] the regular expression would match the characters 1 and A.
        Here the square brackets [] contain a range of characters, 0-9 , which indicates all digits between and including 0 and 9 . The + symbol is a regular expression special character with the special meaning «match one or more of the preceding». In the example pattern we have placed the + after our range [0-9] to detect one or more characters within our range; without the + the pattern will only match a single digit, for example 1 or 5 but not 11 or 15 .
      • The parentheses () in a regular expression create a backreference . The $1 in our substitution links to this backreference . For example, if the URL example.com/product/127 was input, 127 would be matched to our range in the backreference, resulting in the substitution /products.php?id=127 . There can be multiple backreferences in a pattern (for instance ^/product/([0-9]+)/([a-z-]+)$ contains two groups). Backreferences are numbered in the order they appear: the first group links to $1 in the substitution and the second group to $2 ; a third group would be $3 , and so on.
    • $1 – Is located in our substitution and links to our first backreference , which is located in the parentheses in our pattern: ([0-9]+) .

    SOLVING THE PROBLEM USING REGULAR EXPRESSIONS

    The scope of what you can do with regular expressions is so large that it really deserves its own article; for now, we will focus only on the regular expression we require for the following problem: example.com/products.php?id=7

    We could successfully use the basic rewrite technique from the start of the article to create one single rule to rewrite this URL. However, in this situation we will assume, as in the previous section, that there are multiple products. We do not want to create individual rewrite rules for every product as this would be very time consuming.

    The problem here, which can’t easily be solved with NGINX rewrite rules alone, is finding the name of the product related to «id» in our database.

    This can be done with an external rewriting program, but I highly recommend against doing so, as you may find this technique causes countless problems: buffering issues, randomly returned results, and many more undesirables.

    The problem will need to be resolved through the database and backend code. My recommended solution would be to add a new column to your database table labelled something similar to ‘product_name’ which will contain the name of the product.

    Your URL would then become, for example: example.com/product/apples

    The work required to make this change is minimal. Even if your website’s backend code relies heavily on the «id» variable from the URL, you can still obtain it easily by querying the database at the start of your code for the row whose «product_name» equals «apples» .

    In the previous section we created a dynamic rewrite rule using back references and a regular expression in our pattern to detect multiple digits. We are now going to create a similar rule using back references with a different regular expression pattern for use with our new ‘product_name’ variable.

    Previously our range was [0-9] . Because our new variable product_name contains letters instead of numerals, we change the range to [a-z] to detect all lowercase letters between a and z:
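    A sketch of the adjusted rule (the product_name parameter matches the new column suggested above):

        # inside the server { } block
        location /product/ {
            # /product/apples -> /products.php?product_name=apples
            rewrite ^/product/([a-z]+)/?$ /products.php?product_name=$1 break;
        }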

    What if we want our range to also detect numerals, uppercase letters and even hyphens for product names with a hyphen separator such as ‘apple-juice’ ?

    You can simply add more characters to the range, like so: [a-zA-Z0-9-] . Notice that the hyphen has been added to the end of the range; placed there, it is treated literally rather than as a range separator.

    SPECIAL CHARACTERS TO BE AWARE OF WHEN USING REGULAR EXPRESSIONS

    There are certain characters known as Special Characters that we need to be aware of when using regular expressions as they have special meanings.

    A good example of a regular expression special character is the period ‘.’ character.

    It is quite common for a period to be used in a pattern that includes a file name, for example:
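    Something like the following (a sketch):

        # the unescaped period matches ANY single character, not just "."
        rewrite ^/index.html$ /index.php break;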

    This rewrite rule will work in substituting index.html for index.php .
    The problem is that it will also work for substituting index^html , indexshtml or index1html for index.php .

    This occurs because the period is a special character in a regular expression with the special meaning: “Any character” . Therefore, the rewrite rule will work with any character that substitutes the position of the period.

    To use a period as a Literal Character (i.e. without its special meaning) in a regular expression you need to ‘escape’ the period with a preceding backslash: ‘\.’ .
    For example:
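    The escaped version (a sketch):

        # "\." now matches only a literal period
        rewrite ^/index\.html$ /index.php break;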

    Note that we do not need to escape the period used in the substitution, as it is only the pattern that is treated as a regular expression in the rewrite rule.

    Regular Expression Special Characters include:

    • * (zero of more of the preceding)
    • . (any character)
    • + (one or more of the preceding)
    • {} (minimum to maximum quantifier)
    • ? (zero or one of the preceding; non-greedy modifier after a quantifier)
    • ! (negative modifier)
    • ^ (start of a string, or ‘negative’ if used at the start of a range)
    • $ (end of a string)
    • [] (match any of contents)
    • - (range when used between square brackets)
    • () (backreference group)
    • | (or)
    • \ (the escape character)

    The escape character (backslash) itself also needs to be escaped when used as a literal character. Depending on the programming language and parser you may need to use four backslashes instead of two ‘\\\\’ , this is because the programming language may also be using a backslash as an escape character.

    FLAGS

    Throughout this article we have used the break flag in our examples, to notify NGINX that no following rules should be applied once the current rewrite rule is in use.

    However, break is not the only flag at our disposal when creating rewrite rules. The following flags can also be used in your rules to notify NGINX of other information:

    • last (stops processing the current set of directives and starts a new search for a matching location with the changed URL)
    • break (stops processing the current set of rewrite directives, without searching for a new location)
    • redirect (returns a temporary redirect with code 302)
    • permanent (returns a permanent redirect with code 301)


    ORDER OF PRECEDENCE WITH LOCATION BLOCKS

    The order in which location blocks are processed is not as obvious as it may seem. Location blocks are not processed in cascading order; the type of location block determines the order of precedence. The order of precedence is as follows:

    1. Exact location blocks (location =)
    2. Regular-expression location blocks (location ~ and location ~*)
    3. Standard prefix location blocks (location)

    This is useful to know, as you can create advanced rewrite rules with fall-back options by relying on the order of precedence.

    If we were to input a URL matched by all three kinds of location blocks, the exact location block would be matched first, and the break flag would stop the other location blocks from processing.

    SUMMARY

    If you also use Apache Web Server, make sure to check out my article on how to create Clean URL Rewrites Using Apache.

    If you require any help creating a specific rewrite rule or require any further information, please feel free to post your question in the comments.

    Generating SEO (and user) friendly URLs in your ASP.NET MVC Application

    Introduction

    Have a look at the following URL:

    Do you have any idea what the URL is pointing to by just looking at it? Most likely not.

    Now look at this one:

    Any idea? Well, it would seem that it is an article about a hands-on with the Panasonic Lumix DMC-GX8.

    In today’s blog post I would like to demonstrate how to generate “friendly” URLs in your ASP.NET applications. Friendly not just in the sense that someone can look at it and figure out what the URL is pointing to, but more importantly friendly for search engines.

    Right now you may ask what difference it makes to search engines like Google what the URL of a page is. Surely a computer does not care? Well, you would be sort of right, but the thing is that having the keywords you are trying to rank for in the URL of a page does indeed make a difference.

    There are many things which play a role in how Google determines the ranking of a web page, and no one can say for certain exactly how big a part each of those factors play. What most people agree on however is that having the keywords you want a page to rank for in the URL of the page does indeed help with the ranking of the page.

    In this blog post I am going to show you how to generate URLs in your application which are more SEO- and user-friendly.

    Generate URLs

    In my sample application I have created a fictitious product database which displays a listing of products, and allows a user to click on a specific product to navigate to the details page for that product.

    My product class is fairly simple: just an Id and a Name (see the sketch below).

    On the product listing page I simply display a list of products:

    And when the user clicks on a specific product they are navigated to a product details page:

    Take a look at the URL which we are generating: it is just a normal URL, as you get in most ASP.NET MVC applications, of the form /products/details/2 , which passes along the ID of a particular database row to the controller action.

    We want something more user-friendly, something which at least also contains the name of our product. To generate a friendly URL I have created a new method on my Product class which generates a proper “slug” (or URL) for the product details page:
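    A sketch of what such a class and method might look like (the property list and slug format are my assumptions; the slug leads with the numeric Id so it can be recovered later):

        using System.Text.RegularExpressions;

        public class Product
        {
            public int Id { get; set; }
            public string Name { get; set; }

            // "Panasonic Lumix DMC-GX8" with Id 2 -> "2-panasonic-lumix-dmc-gx8"
            public string GenerateSlug()
            {
                string slug = Name.ToLowerInvariant();
                slug = Regex.Replace(slug, @"[^a-z0-9\s-]", "");      // strip invalid characters
                slug = Regex.Replace(slug, @"[\s-]+", "-").Trim('-'); // collapse separators
                return string.Format("{0}-{1}", Id, slug);
            }
        }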

    And I have also updated my listing page to use the slug as the route parameter instead of the Id as it did before:

    Now when I navigate to the product detail page I can see that we have a proper “friendly” URL which contains the name of the product:

    Parameter binding

    But also notice that we have a new error. ASP.NET MVC is complaining that it was expecting an id route parameter, but could not find it. This is because the Details action on my controller is expecting an integer value for the id parameter, but instead of an integer, the id part of our route now contains a string. The MVC framework tries to convert the string to an integer, cannot do it, and therefore passes a null value along, then complains that the id parameter cannot contain a null value.

    To fix this error we need to modify the RouteData for the route to fix up the value of the id route parameter.

    I have created a new class SeoFriendlyRoute that inherits from Route and overrides the GetRouteData method to clean up the id parameter. In my implementation I call the base GetRouteData method and check whether the route data is not null, in which case it means that I have a match for my route.

    In this case I simply check whether the id parameter is present, and if it is, I use a regular expression to extract the leading numerical part of the slug as the actual id route value:
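    A sketch of the route class as described (the class and method names come from the text; the registration at the end is my assumption about how it is wired up):

        using System.Text.RegularExpressions;
        using System.Web;
        using System.Web.Mvc;
        using System.Web.Routing;

        public class SeoFriendlyRoute : Route
        {
            public SeoFriendlyRoute(string url, RouteValueDictionary defaults, IRouteHandler routeHandler)
                : base(url, defaults, routeHandler) { }

            public override RouteData GetRouteData(HttpContextBase httpContext)
            {
                var routeData = base.GetRouteData(httpContext);

                // null means the route did not match this request
                if (routeData != null && routeData.Values.ContainsKey("id"))
                {
                    // "2-panasonic-lumix-dmc-gx8" -> "2"
                    var match = Regex.Match(routeData.Values["id"].ToString(), @"^(\d+)");
                    if (match.Success)
                        routeData.Values["id"] = match.Groups[1].Value;
                }

                return routeData;
            }
        }

        // Registered ahead of the default route, e.g. in RouteConfig.RegisterRoutes:
        // routes.Add(new SeoFriendlyRoute("products/details/{id}",
        //     new RouteValueDictionary(new { controller = "Products", action = "Details" }),
        //     new MvcRouteHandler()));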

    And the last bit is to add a new route for the path /products/details/ that uses my new SeoFriendlyRoute class (the registration is sketched above). Now when I refresh the page, the parameters are bound correctly: only the numeric part of the product slug is passed as the id parameter.

    PS: If you need assistance on any of your ASP.NET Core projects, I am available for hire for freelance work.


    Friendly URL

    A friendly URL is a Web address that is easy to read and includes words that describe the content of the webpage. This type of URL can be «friendly» in two ways: 1) it can help visitors remember the Web address, and 2) it can help describe the page to search engines.

    Friendly URLs that are short and easy to remember are considered user-friendly. For example, a company may use « www.[company].com/support/ » for the support section of its website. This is much easier to remember than a long convoluted URL, like « www.[company].com/section/support/default.aspx?… ».

    Since dynamic sites often load different content based on the variables in the URL (which are usually listed after the question mark), creating user-friendly URLs is not always easy. Therefore, many webmasters now use a strategy called «URL rewriting» to create simpler URLs. This method tells the Web server to load a different URL than the one in the address bar. Therefore, a simple Web address can point to a more complex URL with lots of variables. Since the URL is redirected at the server level, visitors only see the simple Web address.

    Search Engine-Friendly URLs

    While user-friendly URLs are helpful for visitors, most webmasters are more concerned with creating search engine-friendly URLs. These URLs include important keywords that describe the content of the page. Since most search engines include the Web address as part of the information that describes a page, placing keywords in the URL can help boost the ranking of the page. Therefore, this strategy has become a popular aspect of search engine optimization or SEO.

    For example, a blog that includes tips for Windows 7 may have a URL like » blogger.blogger.com/2011/02/windows7.html «. A search engine friendly version of this URL may be » blogger.blogger.com/2011/02/helpful-tips-for-using-windows-7.html «. While this type of descriptive URL may help with search engine ranking, it is important not to create ridiculously long URLs just to describe the page. After all, search engines still focus primarily on the content of each page when indexing websites.

    Updated: February 11, 2011


    [PHP] Parsing search strings from referer urls?

    First time post, please be gentle.

    I’d like to be able to extract search strings from referer urls that come from search engines (via php,
    of course). For example, http://www.google.com/search?q=foo+bar&ie=UTF-8&oe=UTF-8

    Now, I realize one might employ grep to pull out this information and it’s easy for a human to examine
    a url like the one above, but I’d like to programmatically present a nice, tidy list of search words used
    to generate the url.

    My gut says that one would need to write a function for each major search engine to parse out this
    information since each engine is unique in how it builds the url. However, I thought after poring over
    the manual and online docs/list and not finding a solution, and before I went off and reinvented the
    wheel, I would float the question and see what you fine folks have to say about the subject.

    Thanks in advance for any insight.
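    A minimal sketch of the parse_url()/parse_str() approach (the engine-to-parameter map is my assumption and would need one entry per engine):

        <?php
        // map a fragment of the engine's host name to its search parameter
        $engines = [
            'google.' => 'q',
            'bing.'   => 'q',
            'yahoo.'  => 'p',
        ];

        function extract_search_terms(string $referer, array $engines): ?array
        {
            $parts = parse_url($referer);
            if (empty($parts['host']) || empty($parts['query'])) {
                return null;
            }
            parse_str($parts['query'], $params); // also decodes "foo+bar" to "foo bar"
            foreach ($engines as $host => $param) {
                if (strpos($parts['host'], $host) !== false && isset($params[$param])) {
                    return preg_split('/\s+/', trim($params[$param]));
                }
            }
            return null;
        }

        print_r(extract_search_terms(
            'http://www.google.com/search?q=foo+bar&ie=UTF-8&oe=UTF-8',
            $engines
        ));
        // Array ( [0] => foo [1] => bar )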



    Jason Barnett at Jan 26, 2005 at 8:44 pm


    Teach a man to fish.

    Jennifer Goodie at Jan 26, 2005 at 10:53 pm
    Jason Wong at Jan 26, 2005 at 10:54 pm


    Friendly URL in custom php code

    Budget €30-250 EUR

    In a custom PHP web site I would like to make friendly URLs.

    Now the url is [login to view URL]

    Project ID: #7196478


    23 freelancers are bidding on average €118 for this job

