This is not a tutorial, but more like sharing a nice geeky road-trip ;)
I have a pretty good understanding of the Youtube Data API, as I have actively used it on portalZINE TV in the past, to upload videos and dynamically link them to my local post-types.
For one of my latest customer projects (TYPEMYKNIFE / typemyknife.com), the task was a bit more complex, and the goal was to make it as future-proof as possible with the Google APIs :)
Prerequisites / References to get you started:
The goal of the setup was to actively synchronize WooCommerce products with their linked / attached videos, which are hosted on Youtube.
As the website is multilingual, WPML integration is critical as well. And as Youtube allows localization of title and description, that can be added into the mix quite easily in the future ;)
The following product attributes should be mirrored and optimised for Youtube:
The following attributes should be integrated into the description to enrich the Youtube description:
All of these attributes are collected internally and assigned using a simple template system, which allows the customer to move parts around and lay out the Youtube description freely.
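Just to illustrate the idea (the placeholder tokens and attribute values below are made up, not the customer's actual template):

<?php
// Hypothetical sketch of such a template system: the customer rearranges
// {placeholder} tokens, the plugin fills them with product attributes.
function build_youtube_description( string $template, array $attributes ): string {
    $replacements = array();
    foreach ( $attributes as $key => $value ) {
        $replacements[ '{' . $key . '}' ] = $value;
    }
    return strtr( $template, $replacements );
}

$template = "{title}\n\n{excerpt}\n\nConfigure yours: {product_url}";

echo build_youtube_description( $template, array(
    'title'       => 'Custom engraved chef knife',
    'excerpt'     => 'Laser engraving, personalised for you.',
    'product_url' => 'https://example.com/product',
) );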
The following stats will be collected for review:
Youtube SEO
These are the key aspects that help your videos get more views.
In the past, access to the Youtube Data API was far easier and less limited when it comes to offline / non-expiring OAuth2 refresh tokens.
When you are building a server-side application that is only available to your customer or moderators, it makes no sense to run that app through Google's app verification. Your app will never be used by the public.
The Youtube Data API and its scopes are classified as sensitive and therefore require a third-party security assessment for public access.
The scopes I am requesting are https://www.googleapis.com/auth/youtube.upload + https://www.googleapis.com/auth/youtube.
Because of that, it is far easier to just set up OAuth 2 in test mode and restrict access to your customer and specific additional accounts only (up to 100 test users allowed). What all these accounts need is access to your own or brand Youtube channel.
Preparation in the Google Cloud Console:
A detailed description can be found here.
You can circumvent verification for the consent screen by using an organisation setup at Google. Here is some info about that. With that setup, offline refresh tokens should work fine.
Update: Just tried that, but it won't work with a brand Youtube account, even though the cloud user has admin access to it. Not giving up yet, but Google / Youtube really makes it difficult to just have a simple offline solution for specific tasks ;) BTW, I also forced the login hint to make sure the right account is logged in: $client->setLoginHint('YourWorkspaceAccount');
You might have heard of „The League of Extraordinary Packages“. It is a group of developers who have banded together to build solid, well-tested PHP packages using modern coding standards.
They also offer an OAuth2-client + OAuth2 Google extension that can be used.
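A rough sketch of how the Google provider from that OAuth2 client can be wired up for offline access (client ID, secret and redirect URI are placeholders; check the package docs for the current API):

<?php
// Sketch using league/oauth2-google (composer require league/oauth2-google).
require __DIR__ . '/vendor/autoload.php';

$provider = new League\OAuth2\Client\Provider\Google([
    'clientId'     => 'XXXXXXX.apps.googleusercontent.com',
    'clientSecret' => 'XXXXXXXXXX',
    'redirectUri'  => 'https://example.com/oauth2callback',
    'accessType'   => 'offline', // ask Google for a refresh token
]);

if ( ! isset( $_GET['code'] ) ) {
    // Step 1: send the user to Google's consent screen.
    // (A state check against CSRF is omitted here for brevity.)
    $authUrl = $provider->getAuthorizationUrl([
        'scope' => [
            'https://www.googleapis.com/auth/youtube.upload',
            'https://www.googleapis.com/auth/youtube',
        ],
    ]);
    header( 'Location: ' . $authUrl );
    exit;
}

// Step 2: exchange the authorization code for tokens.
$token = $provider->getAccessToken( 'authorization_code', [ 'code' => $_GET['code'] ] );
echo 'Access token: '  . $token->getToken() . PHP_EOL;
echo 'Refresh token: ' . $token->getRefreshToken() . PHP_EOL;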
On the server, the Google API PHP SDK can be easily integrated using Composer.
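A minimal sketch of what that looks like (the application name is a placeholder):

<?php
// Installed via Composer: composer require google/apiclient
require __DIR__ . '/vendor/autoload.php';

$client = new Google_Client();
$client->setApplicationName( 'My YouTube Sync' ); // placeholder name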
In my customer plugin, I neatly separated all relevant areas into classes & traits:
You can check the expiry time of your access token by accessing:
https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=YOUR_TOKEN

„A Google Cloud Platform project with an OAuth consent screen configured for an external user type and a publishing status of ‚Testing‘ is issued a refresh token expiring in 7 days.“ – Google
Basic Auth example from the SDK:
<?php

// Call set_include_path() as needed to point to your client library.
set_include_path($_SERVER['DOCUMENT_ROOT'] . '/directory/to/google/api/');
require_once 'Google/Client.php';
require_once 'Google/Service/YouTube.php';
session_start();

/*
 * You can acquire an OAuth 2.0 client ID and client secret from the
 * {{ Google Cloud Console }} <{{ https://cloud.google.com/console }}>
 * For more information about using OAuth 2.0 to access Google APIs, please see:
 * <https://developers.google.com/youtube/v3/guides/authentication>
 * Please ensure that you have enabled the YouTube Data API for your project.
 */
$OAUTH2_CLIENT_ID = 'XXXXXXX.apps.googleusercontent.com';
$OAUTH2_CLIENT_SECRET = 'XXXXXXXXXX';
$REDIRECT = 'http://localhost/oauth2callback.php';
$APPNAME = "XXXXXXXXX";

$client = new Google_Client();
$client->setClientId($OAUTH2_CLIENT_ID);
$client->setClientSecret($OAUTH2_CLIENT_SECRET);
$client->setScopes('https://www.googleapis.com/auth/youtube');
$client->setRedirectUri($REDIRECT);
$client->setApplicationName($APPNAME);
$client->setAccessType('offline');

// Define an object that will be used to make all API requests.
$youtube = new Google_Service_YouTube($client);

if (isset($_GET['code'])) {
    if (strval($_SESSION['state']) !== strval($_GET['state'])) {
        die('The session state did not match.');
    }

    $client->authenticate($_GET['code']);
    $_SESSION['token'] = $client->getAccessToken();
}

if (isset($_SESSION['token'])) {
    $client->setAccessToken($_SESSION['token']);
    echo '<code>' . $_SESSION['token'] . '</code>';
}

// Check to ensure that the access token was successfully acquired.
if ($client->getAccessToken()) {
    try {
        // Call the channels.list method to retrieve information about the
        // currently authenticated user's channel.
        $channelsResponse = $youtube->channels->listChannels('contentDetails', array(
            'mine' => 'true',
        ));

        $htmlBody = '';
        foreach ($channelsResponse['items'] as $channel) {
            // Extract the unique playlist ID that identifies the list of videos
            // uploaded to the channel, and then call the playlistItems.list method
            // to retrieve that list.
            $uploadsListId = $channel['contentDetails']['relatedPlaylists']['uploads'];
            $playlistItemsResponse = $youtube->playlistItems->listPlaylistItems('snippet', array(
                'playlistId' => $uploadsListId,
                'maxResults' => 50
            ));

            $htmlBody .= "<h3>Videos in list $uploadsListId</h3><ul>";
            foreach ($playlistItemsResponse['items'] as $playlistItem) {
                $htmlBody .= sprintf('<li>%s (%s)</li>', $playlistItem['snippet']['title'],
                    $playlistItem['snippet']['resourceId']['videoId']);
            }
            $htmlBody .= '</ul>';
        }
    } catch (Google_ServiceException $e) {
        $htmlBody .= sprintf('<p>A service error occurred: <code>%s</code></p>',
            htmlspecialchars($e->getMessage()));
    } catch (Google_Exception $e) {
        $htmlBody .= sprintf('<p>A client error occurred: <code>%s</code></p>',
            htmlspecialchars($e->getMessage()));
    }

    $_SESSION['token'] = $client->getAccessToken();
} else {
    $state = mt_rand();
    $client->setState($state);
    $_SESSION['state'] = $state;

    $authUrl = $client->createAuthUrl();
    $htmlBody = <<<END
<h3>Authorization Required</h3>
<p>You need to <a href="$authUrl">authorise access</a> before proceeding.<p>
END;
}
?>

<!doctype html>
<html>
<head>
<title>My Uploads</title>
</head>
<body>
<?php echo $htmlBody?>
</body>
</html>
A simple upload example can be found here.
try {
    // REPLACE this value with the video ID of the video being updated.
    $videoId = "VIDEO_ID";

    // Call the API's videos.list method to retrieve the video resource.
    $listResponse = $youtube->videos->listVideos("snippet", array('id' => $videoId));

    // If $listResponse is empty, the specified video was not found.
    if (empty($listResponse)) {
        $htmlBody .= sprintf('<h3>Can\'t find a video with video id: %s</h3>', $videoId);
    } else {
        // Since the request specified a video ID, the response only
        // contains one video resource.
        $video = $listResponse[0];
        $videoSnippet = $video['snippet'];
        $tags = $videoSnippet['tags'];

        // Preserve any tags already associated with the video. If the video does
        // not have any tags, create a new list. Replace the values "tag1" and
        // "tag2" with the new tags you want to associate with the video.
        if (is_null($tags)) {
            $tags = array("tag1", "tag2");
        } else {
            array_push($tags, "tag1", "tag2");
        }

        // Set the tags array for the video snippet
        $videoSnippet['tags'] = $tags;

        // Update the video resource by calling the videos.update() method.
        $updateResponse = $youtube->videos->update("snippet", $video);

        $responseTags = $updateResponse['snippet']['tags'];

        $htmlBody .= "<h3>Video Updated</h3><ul>";
        $htmlBody .= sprintf('<li>Tags "%s" and "%s" added for video %s (%s) </li>',
            array_pop($responseTags), array_pop($responseTags),
            $videoId, $video['snippet']['title']);
        $htmlBody .= '</ul>';
    }
} catch (Google_Service_Exception $e) {
    $htmlBody .= sprintf('<p>A service error occurred: <code>%s</code></p>',
        htmlspecialchars($e->getMessage()));
} catch (Google_Exception $e) {
    $htmlBody .= sprintf('<p>A client error occurred: <code>%s</code></p>',
        htmlspecialchars($e->getMessage()));
}
All operations to and from the Youtube Data API are rate-limited. What is important for us is the daily query quota.
The default quota is 10,000 units per day. That sounds like a lot, but it is easily gone after updating 150-200 videos. You can request this limit to be raised, but again that means a lot of paperwork and questions that are just not needed.
The above limit just means that you need to cache as many query results as possible and only query live when needed ;)
Something you learn fast when experimenting with different things! I hit that limit multiple times in the first few days, with around 500 videos in the queue.
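Here is a minimal sketch of the kind of caching that helps; the helper and transient names are mine, not the plugin's actual code:

<?php
// Hypothetical helper: cache a videos.list response in a WordPress transient
// so repeated runs do not burn daily quota units.
function pz_get_video_snippet( Google_Service_YouTube $youtube, string $video_id ) {
    $cache_key = 'pz_yt_snippet_' . $video_id;
    $cached    = get_transient( $cache_key );

    if ( false !== $cached ) {
        return $cached; // served from cache, zero quota cost
    }

    $response = $youtube->videos->listVideos( 'snippet', array( 'id' => $video_id ) );
    $snippet  = isset( $response[0] ) ? $response[0]['snippet'] : null;

    set_transient( $cache_key, $snippet, 12 * HOUR_IN_SECONDS );

    return $snippet;
}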
Different operations cost you different amounts of units.
It also helps to use the Google OAuth 2.0 Playground to test-drive the Youtube Data API with your own credentials while optimising your own code.
You can define your own OAuth 2.0 configuration by clicking the cog in the upper right corner.
I set up the bulk updating to allow splitting it over multiple days, if required. For this, an offline refresh token is needed, as the standard access token expires after 60 minutes.
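A sketch of how such an offline refresh could look with the PHP SDK (the stored option name is hypothetical):

<?php
// Keep a long-running bulk update alive with an offline refresh token.
// Assumes the refresh token was stored earlier, here in a WordPress option.
$client = new Google_Client();
$client->setClientId( 'XXXXXXX.apps.googleusercontent.com' );
$client->setClientSecret( 'XXXXXXXXXX' );
$client->setAccessType( 'offline' ); // required to receive a refresh token at all

$refresh_token = get_option( 'pz_yt_refresh_token' ); // hypothetical option name

if ( $client->isAccessTokenExpired() ) {
    // Exchange the stored refresh token for a fresh 60-minute access token.
    $client->fetchAccessTokenWithRefreshToken( $refresh_token );
}

$youtube = new Google_Service_YouTube( $client );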
My customer can also just update a single video, when changes are applied to the product or a new product has been added.
If more frequent updates are required, I will ask for a higher daily quota. You can circumvent the limit by using multiple Google Cloud Platform accounts with new OAuth credentials, but that is really overkill right now. I have done that in the past ;)
The GUI is simply based on Bootstrap, to keep it simple and clean. I am using my own wrapper to make it work within the WordPress admin.
For all ajax operations, I am using htmx and _hyperscript, which I will talk about in another article in the future.
Really neat and clean way to build single page interfaces.
The whole plugin runs off its own REST API endpoint. I just love using WordPress as a headless system.
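For illustration, a custom endpoint can be registered like this; namespace, route and capability check are placeholders, not the plugin's real ones:

<?php
// Minimal sketch of a custom WordPress REST endpoint.
add_action( 'rest_api_init', function () {
    register_rest_route( 'pz-youtube/v1', '/sync/(?P<product_id>\d+)', array(
        'methods'             => 'POST',
        'permission_callback' => function () {
            return current_user_can( 'manage_options' );
        },
        'callback'            => function ( WP_REST_Request $request ) {
            $product_id = (int) $request['product_id'];
            // ... trigger the Youtube sync for this product here ...
            return rest_ensure_response( array( 'synced' => $product_id ) );
        },
    ) );
} );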
I used Twig / Timber for the templates to separate logic and layout. Timber has been my go-to solution for years now. It drives my own and many customer websites.
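Just to illustrate the idea (template name and context values are made up), a Timber call typically looks like this:

<?php
// PHP collects the data, the Twig file handles the markup.
// 'youtube-overview.twig' is a hypothetical template name.
$context = array(
    'title'  => 'My Uploads',
    'videos' => array( 'Video A', 'Video B' ),
);
Timber\Timber::render( 'youtube-overview.twig', $context );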
This has been a lot of fun, maybe a bit too much LOL
I do geek out about many of my projects, but this experience helped me bring my WordPress toolbox to the next level. It will help drive other things in the future.
Working so deeply with the Youtube Data API has been fun and feels so easy now, after all remaining problems have been solved.
Would have loved this during my portalZINE TV days ;)
If you read all this, you just earned yourself a completion badge ;)
Need something similar or something else? Just say hi and we can talk.
„WPML (WordPress Multilingual) makes it easy to build multilingual sites and run them. It’s powerful enough for corporate sites, yet simple for blogs.“ – WPML
I have been running and setting up multilingual websites for more than 12 years. Thankfully, WordPress and related integrations have come a long way to make our lives a lot easier.
For basic content WPML is almost plug & play, but I do see more and more sites / customers struggling with more complex setups. WPML is one of the most popular multilingual plugins and is used on hundreds of thousands of websites.
Just so you know, WPML is a commercial solution!
The number of settings has increased a lot over the years and offers possible solutions for almost any content / plugin setup.
But for more complex setups, I would suggest hiring a professional to look over the settings, or studying the plugin documentation carefully.
Especially with a lot of content, problems can pile up quickly, along with the need to revisit specific content over and over again.
WPML lets you translate any text that comes from themes / theme frameworks (DIVI, Elementor, Gutenberg …), plugins, menus, slugs, SEO and additionally supported integrations (Gravity Forms, ACF, WooCommerce …).
You can translate content internally for yourself, using translation management to translate with an internal team of translators or get help from external translators / translation services.
The latest version also offers AI translations, which allows you to get a decent start for most of your content.
In addition to the above, WPML String Translation allows you to translate texts that are not in posts, pages and taxonomy. This includes the site’s tagline, general texts in admin screens, widget titles and many other areas.
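For illustration, this is roughly how WPML's string translation hooks are used (the context and string names here are made up):

<?php
// Register a string so translators can pick it up in String Translation.
do_action( 'wpml_register_single_string', 'my-plugin', 'cta-button-label', 'Buy now' );

// Retrieve the translation for the current language (falls back to the original).
$label = apply_filters( 'wpml_translate_single_string', 'Buy now', 'my-plugin', 'cta-button-label' );
echo esc_html( $label );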
Well, I am a bit biased. I have not looked much at other solutions for the past 5 years, as it offers all I really need.
I have used it on projects with 2 to 15 languages and it scales nicely. At least with proper hosting attached!
Anything can be tweaked through the API, hooks and custom integrations. I have built additional WPML tools for my customers to streamline some of the repetitive / boring tasks.
Their support is responsive and the forum already provides a huge amount of answers to most of the questions that might come up.
If you develop / maintain multiple customer websites with multilingual content, the investment is quickly amortized. I do offer WPML to my maintenance package customers, maybe something to consider ;)
It's an essential solution in my WP toolbox.
WPML 4.5 is on its way and will include a „Translate Everything“ feature, among other fixes and enhancements.
Translate Everything allows you to translate all of your site’s content automatically as you create it. You can then review the translations on the front-end before publishing.
Updated 25.03.: Some function names changed in the latest beta version.
The ACF 5.8 beta introduced an easy way to create your own custom Gutenberg blocks. I am already using it heavily for a current project to easily organize content and media assets.
Really powerful when combined with Timber, which has been the foundation of many of my themes for years now ;)
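For reference, registering such a block with ACF 5.8+ looks roughly like this; the block name, template path and Timber-based render callback are just one possible setup, not the exact code from my project:

<?php
// Sketch: register a custom Gutenberg block via ACF and render it with Timber / Twig.
add_action( 'acf/init', function () {
    if ( ! function_exists( 'acf_register_block_type' ) ) {
        return;
    }

    acf_register_block_type( array(
        'name'            => 'media-asset',   // hypothetical block name
        'title'           => 'Media Asset',
        'category'        => 'formatting',
        'render_callback' => function ( $block ) {
            $context = array(
                'block'  => $block,
                'fields' => get_fields(), // the block's ACF field values
            );
            Timber\Timber::render( 'blocks/media-asset.twig', $context );
        },
    ) );
} );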
Organizing data using ACF is nice, but sometimes you want direct access to that saved block data. I hate it when I am confined to boundaries and the data flow is restricted or hidden. I need things to be accessible so I can choose the creative flow myself.
// Parse blocks from post content
$blocks = parse_blocks($your_post_content);

$collect = array();

// Loop through the blocks
foreach ($blocks as $block) {

    // Setup global block post data context
    // before: acf_setup_postdata
    acf_setup_meta( $block['attrs']['data'], $block['attrs']['id'], true );

    // Get ACF fields
    $fields = get_fields();

    // I am using this to organize my assets.
    // Each block of mine has a unique identifier as its first field:
    // $uid = $block['attrs']['data'][array_keys($block['attrs']['data'])[0]]
    // I would do:
    // $collect[$uid] = $fields;

    // Collection of fields using the block id.
    $collect[$block['attrs']['id']] = $fields;

    // Restore global context
    // before: acf_reset_postdata
    acf_reset_meta( $block['attrs']['id'] );
}
There you go, enjoy some free block data :)
I was a big skeptic when it comes to WordPress and the new Gutenberg editor, but combined with ACF + Timber it's pure magic :) Looking forward to things to come!
Cheers
Alex
Extended example:
$collect = [];
$collect['main'] = ''; // holds the rendered output of all non-ACF blocks

$blocks = parse_blocks($your->post_content);

foreach ($blocks as $block) {

    if ( isset($block['attrs']['data']) && !empty($block['attrs']['data'][array_keys($block['attrs']['data'])[0]]) ) {

        acf_setup_meta( $block['attrs']['data'], $block['attrs']['id'], true );
        $fields = get_fields();
        acf_reset_meta( $block['attrs']['id'] );

        // Use the block's own unique identifier (its first field) as the array key.
        $collect[$block['attrs']['data'][array_keys($block['attrs']['data'])[0]]] = array(
            'render' => render_block( $block ),
            'field'  => $fields,
            'block'  => $block
        );

    } else {
        $collect['main'] .= render_block( $block );
    }
}
The $collect array will hold all data, including all ACF fields. You will have full access to any field, including repeater fields. The $collect['main'] will just collect the standard post content.
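A quick, hypothetical usage example ('my_block_uid' and 'headline' are made-up names):

// Access one block's data via its unique identifier collected above.
$block_data = $collect['my_block_uid'] ?? null;

if ( $block_data ) {
    echo esc_html( $block_data['field']['headline'] ); // a single ACF field value
    echo $block_data['render'];                        // the rendered block markup
}

// The remaining, non-ACF post content:
echo $collect['main'];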