URL Scraper: How to Extract All URLs from a Web Page Using PHP


It is sometimes necessary to extract all URLs from a web page on demand. Plenty of online tools can do this, but this tutorial shows how to do it yourself with a short PHP script. This kind of scraper can be used, for example, to build a sitemap generator or to fetch result links from a search page.

To extract the list of URLs from a web page, I am going to use PHP's DOMDocument class. The DOM parser functions are part of the PHP core, so no installation is needed. DOMDocument is very good at dealing with HTML as well as XML. You just need to create an instance of DOMDocument, setting its version and character encoding, then load the HTML into it and query it with XPath. Let's see how to do this.

<?php
$url = "https://techalltype.com/myblog";
$html = file_get_contents($url);
if ($html === false) {
    die("Failed to fetch $url");
}
$doc = new \DOMDocument('1.0', 'UTF-8'); /* instance of DOMDocument */
@$doc->loadHTML($html); /* parses the HTML string; @ suppresses warnings about malformed markup */
$xpath = new \DOMXpath($doc); /* used to query selected HTML nodes */
$nodes = $xpath->query('//a[@href]'); /* anchor tags that actually carry an href */
foreach ($nodes as $key => $node) {
    echo ($key + 1).".) ".$node->getAttribute('href')."<br/>";
}
?>
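The script above prints every href exactly as it appears in the page, which means relative paths and duplicate links come through unchanged. For a use case like sitemap generation, you usually want each URL once, in absolute form. Below is a minimal sketch of that idea (my own addition, not part of the original script); the `$base` value and the `extractUniqueUrls` helper name are assumptions for illustration, and only root-relative paths (starting with `/`) are handled here:

```php
<?php
/* Sketch: collect unique href values from an HTML string,
   prefixing root-relative paths with an assumed base URL. */
function extractUniqueUrls(string $html, string $base): array
{
    $doc = new \DOMDocument('1.0', 'UTF-8');
    @$doc->loadHTML($html); /* @ suppresses warnings about malformed markup */
    $xpath = new \DOMXpath($doc);

    $urls = [];
    foreach ($xpath->query('//a[@href]') as $node) {
        $href = trim($node->getAttribute('href'));
        if ($href === '' || $href[0] === '#') {
            continue; /* skip empty links and in-page anchors */
        }
        if ($href[0] === '/') {
            $href = rtrim($base, '/').$href; /* make root-relative paths absolute */
        }
        $urls[$href] = true; /* array keys deduplicate for us */
    }
    return array_keys($urls);
}

$html = '<a href="/about">About</a><a href="/about">Again</a><a href="#top">Top</a>';
print_r(extractUniqueUrls($html, 'https://example.com'));
```

Using the URL string as an array key is a simple way to deduplicate without a separate `in_array` check on every iteration.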

You can find more about the DOMDocument class in the official PHP documentation.

That’s all, and thank you for reading this post. We hope you liked it; please feel free to leave your suggestions in the comments below.

I am Hitesh from Jamshedpur (India). I have been working as a web application developer for more than four years. I love building a diverse, attention-catching web presence for a variety of users, and I enjoy learning new things in web development.
