Understanding and Optimizing PHP Executions

PHP executions are a crucial metric in web hosting: every time the web server runs a PHP script, one execution is counted, and a high number of executions can exhaust account limits and degrade performance. This article explores the most common reasons behind increased PHP executions and provides steps to address them.


The Root Cause: Web Requests

The primary cause of PHP executions is web requests serviced by the web server. These requests can come from actual visitors or from Web Robots (crawlers and other bots). Because Web Robots typically access a website far more frequently than any human user, they can significantly increase the number of PHP executions.

Step 1: Identifying the Robot

To identify high traffic from Web Robots, you can use the AWStats tool in cPanel. If the ‘Not Viewed Traffic’ (traffic generated by Web Robots) is significantly higher than the ‘Viewed Traffic’ (traffic generated by human visitors), robots are a likely cause of the elevated number of executions.

Step 2: Blocking the Robot

Robots can be categorized into two types – Search Engines and Unknown Robots. To reduce traffic from Search Engines, you can instruct them to crawl the website less frequently, either through each engine's webmaster tools or via a crawl-delay directive. For Unknown Robots, you can use the robots.txt file to block or limit their access, as shown in the sketch below.
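As a minimal sketch, the following robots.txt (placed in the website's document root) blocks one hypothetical robot by its User-agent name and asks all other crawlers to slow down. Keep in mind that Crawl-delay is honored by some crawlers but ignored by others, and that malicious bots may disregard robots.txt entirely:

    # Block a hypothetical misbehaving robot by its User-agent string
    User-agent: BadBot
    Disallow: /

    # Ask all other crawlers to wait 10 seconds between requests
    # (honored by some crawlers, ignored by others)
    User-agent: *
    Crawl-delay: 10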

Unprotected Forms and Their Impact on PHP Executions

Unprotected forms on a website can serve as a gateway for malicious bots, leading to a dramatic increase in PHP executions. These bots send direct POST requests to the scripts behind login forms, registration forms, contact forms, and other data-collecting forms. Each POST request is processed by a PHP script, and every such request counts as one script execution.

For instance, consider a scenario where a bot performs a brute-force attack on a login form. At over 1,000 requests per minute, that is over 1,000 script executions per minute. Similarly, a bot abusing a registration form at the same rate causes not only 1,000 script executions every minute but also the creation of numerous fake profiles on the website.
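Note that a bot does not even need to load the page containing the form; it can POST directly to the script that processes it. A hypothetical example against a WordPress login (the credentials are made up) looks like this, and every such request forces wp-login.php to execute:

    curl -d "log=admin&pwd=123456" https://example.com/wp-login.php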

Step 1: Locating the Unprotected Forms

Identifying unprotected forms is the first step towards addressing this issue. The easiest way to do this is to look into the “Pages-URL” section of the AWStats tool, which lists the top 25 URLs accessed on your website. If login or registration pages appear at the top of the table, it is likely that they are being exploited.
However, if you are the developer of the website, you will likely already be aware of which forms are unprotected, and that knowledge can be invaluable in identifying and addressing potential vulnerabilities.

Step 2: Protecting the Forms

Once the unprotected forms have been identified, the next step is to secure them. The most common protection mechanism is reCAPTCHA verification, which presents a human-verification challenge each time a form is submitted. If the challenge is not solved, which is typically the case when a bot sends the request, the request is terminated before the form is processed.

Most open-source platforms already ship with such protection. For others, you may need to install a plugin or module that adds the reCAPTCHA challenge to all forms. If no such plugin or module is available, you may need to contact an expert web developer who can secure the forms, for example with a server-side check similar to the sketch below.
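For custom-built forms, the following is a minimal sketch of server-side reCAPTCHA v2 verification in PHP. It assumes the form already includes the standard reCAPTCHA widget, and ‘YOUR_SECRET_KEY’ is a placeholder for the secret key issued when you register the site with reCAPTCHA:

    <?php
    // Minimal sketch of server-side reCAPTCHA v2 verification.
    // 'YOUR_SECRET_KEY' is a placeholder for your real secret key.
    $secret   = 'YOUR_SECRET_KEY';
    $response = $_POST['g-recaptcha-response'] ?? '';

    // Ask the reCAPTCHA siteverify endpoint whether the challenge was solved.
    $verify = file_get_contents(
        'https://www.google.com/recaptcha/api/siteverify?' . http_build_query([
            'secret'   => $secret,
            'response' => $response,
            'remoteip' => $_SERVER['REMOTE_ADDR'],
        ])
    );
    $verify = json_decode($verify, true);

    if (empty($verify['success'])) {
        // Bot or failed challenge: stop before any expensive processing runs.
        http_response_code(403);
        exit('Human verification failed.');
    }

    // The visitor passed the challenge - process the form as usual.

The script still executes once per request, but the expensive form processing (database writes, emails, account creation) only runs for verified humans.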


The Impact of Dynamically Generated Elements on PHP Executions

Every web page consists of web requests made to resources on the same or on a remote web server. In many open-source platforms, the CSS and JS files are dynamically generated before being included in the page. Another common example is the use of iframes, often used to display content that lives outside the webpage in which the iframe is embedded.

If such elements are present on a webpage, they can cause separate script executions, depending on the structure of the platform. Another negative aspect is that these elements can return 404 errors when they cannot be found at the source from which they are requested. If the website uses a custom, dynamically generated error page, each of those 404 responses triggers an additional script execution solely to render the error.
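One easy mitigation, assuming an Apache web server that reads .htaccess files, is to serve a plain static HTML file for 404 responses so that missing resources no longer cost a PHP execution. The /404.html path below is a placeholder for a static page you would create:

    # .htaccess: serve a static page for 404s instead of a PHP-generated one
    ErrorDocument 404 /404.html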

Step 1: Identifying Faulty Web Requests

The first step in addressing this issue is to identify the faulty web requests. Tools like Pingdom can be extremely helpful in this regard. Once you submit the URL of the page that needs to be checked, the tool provides a report page containing detailed information. The sections that matter here are the ‘Response codes’ section and the ‘File requests’ section.

The ‘Response Codes’ section outlines all the response codes of each request performed on the webpage. The ‘Connection Error’ line is particularly important as it shows the number of such errors.
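If you want to double-check an individual request from the command line, curl can print just the HTTP status code of a resource (the URL below is a placeholder):

    curl -s -o /dev/null -w "%{http_code}\n" https://example.com/path/to/resource.css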

Step 2: Resolving Faulty Web Requests

Once you have identified the faulty web requests, the next step is to resolve them. The approach depends on the nature of the requests. If the requests are for static resources such as CSS or JS files, you can often resolve the issue by correcting their URLs; a quick way to find where a broken URL is referenced is shown below.
If the requests are for dynamic resources, you will need to investigate why the scripts being loaded execute improperly.
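Assuming you have SSH access to the account, a recursive grep over the document root quickly reveals which files reference the broken URL (both the filename and the path below are placeholders):

    grep -rn "broken-style.css" ~/public_html/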

This might involve examining the code of the scripts, checking the server configuration, or looking into other aspects of your website’s setup. This process can be complex and may require some development knowledge. If you’re not comfortable with this, it might be advisable to hire an expert web developer.


Misconfigured Cron Jobs and Their Effect on PHP Executions

Cron jobs are tasks that are scheduled to run automatically at specified intervals on a server. They are a powerful tool for automating tasks, but if they are misconfigured, they can result in a constant stream of script executions, leading to a high number of PHP executions.

One common issue is the use of the Linux ‘wget’ command in cron jobs. The ‘wget’ command is often used for retrieving web resources via the HTTP, HTTPS, and FTP protocols. It performs a simple GET request to a given URL, and the web server responds with the content of the page that has been requested. ‘Wget’ then saves this content on the file system.

However, ‘wget’ does not consider how the result is generated. This means that the response could be generated dynamically, which would involve a script execution, or by direct static content delivery, which would not. If a cron job is using ‘wget’ and is configured to run every minute, this could result in 60 additional script executions per hour, 1,440 script executions per day, or 43,200 script executions per month.
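For illustration, a problematic crontab entry might look like the following hypothetical line, which fetches a URL every minute, discards the output, and still triggers a full script execution on every run:

    * * * * * wget -q -O /dev/null https://example.com/cron.php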

Step 1: Identifying Misconfigured Cron Jobs

The first step in addressing this issue is to identify any misconfigured cron jobs. You can see and configure the cron jobs for your web hosting account via your cPanel → Cron Jobs. Under the ‘Current Cron Jobs’ section, you will see a list of all the cron jobs that are configured for your web hosting account.

Step 2: Repairing Misconfigured Cron Jobs

If you find any cron jobs using ‘wget’, you should consider editing them. One recommendation is to change the ‘Common Settings’ for the execution time to “Once an hour (0 * * * *)” and to replace ‘wget’ with a direct call to the PHP executable. This is possible if you know the path to the cron script that needs to be executed.

For example, here are the cron job commands for some of the most used open-source platforms:

  • WordPress: /usr/local/bin/php -q /wordpress/root/folder/wp-cron.php
  • Magento: cd /magento/root/folder && /usr/local/bin/php bin/magento cron:run
  • Drupal: /usr/local/bin/php -q /drupal/root/folder/cron.php
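Put together, a corrected crontab entry for a WordPress site (assuming the installation lives in /wordpress/root/folder) would look like this:

    0 * * * * /usr/local/bin/php -q /wordpress/root/folder/wp-cron.php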

By correctly configuring your cron jobs, you can significantly reduce the number of unnecessary PHP executions and improve the performance of your website.
