Archaeology Travel

I had the chance this year to meet up with my client Thomas Dowson from "Archaeology Travel Media" at the Travel Innovation Summit in Seville.

Over the past two years we have revamped all the content on archaeology-travel.com and integrated a sophisticated travel itinerary builder into the mix. We are almost feature-complete and are currently fine-tuning the system. New explorers are welcome to sign up and test-drive our set of unique features.

It was so nice to finally meet the whole team in person and celebrate what we have accomplished together so far.

What is Archaeology Travel all about?

Taken directly from the front page :)

EXPLORE THE WORLD’S PASTS WITH ARCHAEOLOGY TRAVEL GUIDES, CRAFTED BY EXPERIENCED ARCHAEOLOGISTS & HISTORIANS

Whatever your preferred style of travel, budget or luxury, backpacker or hand luggage only, slow or adventure, if you are interested in archaeology, history and art this is an online travel guide just for you.

Here you will find ideas for where to go, what sites, monuments, museums and art galleries to see, as well as information and tips on how to get there and what tickets to buy.

Our destination and thematic guides are designed to assist you to find and/or create adventures in archaeology and history that suit you, be it a bucket list trip or visiting a hidden gem nearby.

More Details
About
Mission & Vision
Code of Ethics

What is next?

We are constantly expanding our set of curated destinations, locations and POIs. Our plan is to make it even easier to find unique places for your next travel experience.

We are also working on partnerships to enhance travel options and offer an even broader variety of additional content.

Looking forward to all the things to come, as well as to the continued exceptional collaboration between all team members.

Let's continue to help bring the world's past into the future :)


E-A-T: what does that stand for?

In SEO (Search Engine Optimization), there is a concept called "E-A-T", which stands for "Expertise, Authoritativeness, and Trustworthiness".

Google uses E-A-T as one of its many ranking factors to evaluate the quality of content on the web. Websites that consistently produce high-quality content that meets the E-A-T criteria are more likely to rank well in search engine results pages (SERPs).

  1. Expertise: Refers to the level of knowledge, skill, and experience demonstrated by the content creator. Google looks for content that demonstrates a high degree of expertise in the relevant subject matter.
  2. Authoritativeness: Refers to the reputation and authority of the content creator or the website publishing the content. Google looks for content created by individuals or organizations with a strong reputation or authority in the relevant field.
  3. Trustworthiness: Refers to the level of trust users have in the content creator or website publishing the content. Google looks for content that is transparent, accurate, and free from any misleading or deceptive information.

E-A-T is not a specific algorithm or ranking factor that Google uses, but rather a framework or set of guidelines that Google’s quality raters use to evaluate the quality of content on the web. These quality raters assess the content using E-A-T as a benchmark, and their feedback helps Google improve its search algorithms.

It is especially important for content that falls under the category of YMYL (Your Money or Your Life), which includes content related to health, finance, legal, and other topics that can impact people’s well-being or financial stability. Google holds this type of content to a higher standard because inaccurate or misleading information can have serious consequences.

Building E-A-T for your website or content involves a multi-faceted approach that includes creating high-quality, informative, and engaging content, establishing your expertise and authority in your field, and building trust with your audience through transparency and honesty.

Some specific actions you can take to improve your website’s E-A-T include showcasing the expertise of your content creators, publishing authoritative and accurate content, providing clear and transparent information about your business, and building a positive online reputation through reviews, testimonials, and other forms of social proof.

It’s important to note that E-A-T is just one of many factors that Google uses to determine search rankings, and it’s not the only factor. Other factors that can influence search rankings include content relevance, website speed and performance, user experience, and backlinks.

So what is E-E-A-T then?

Just another E, for Experience. In 2023, Google wants to see that a content creator has first-hand, real-world experience with the topic discussed. This means that content and its creators are becoming far more entwined, further pushing the trend towards authoritative, quality content for audiences.

  1. Author experience is a key factor in determining the expertise and authority of a website’s content. Google evaluates the author’s credentials, education, experience, and overall reputation to determine whether they are qualified to write about a particular topic.
  2. Google looks for signals that indicate the author has a high level of expertise in their field, such as relevant education, professional certifications, industry awards, or other forms of recognition.
  3. Author experience can also be demonstrated through the author’s online presence and activity. For example, if the author has published articles in reputable publications or has a strong social media following, this can signal to Google that they have a strong online reputation and are respected in their field.
  4. Building author experience involves investing in the development of your content creators and encouraging them to build their online reputations. This can include providing opportunities for training, networking, and professional development, as well as encouraging them to build their personal brands through activities like blogging, speaking at conferences, and participating in online communities.
  5. One effective way to build author experience is to showcase the author’s credentials and expertise on the website. This can include author bios, headshots, and information about their educational and professional background.

So back to solid author bios, detailed author pages and all relevant links that detail an author's expertise. Thank you! Search is really shifting and things are changing rapidly. Real content is queen or king again :)

Enjoy coding …


THE QUESTION

A while back a potential customer asked me if it is possible to restructure a WordPress Multisite setup with WPML to use a simplified, custom URL structure.


THE ROUGH IDEA

1. BASE website: web.site (possibly with different languages)

web.site/de/
web.site/en/

2. SUB website: web.site/nl-nl/

Languages would normally be added like this:

web.site/nl-nl/de/
web.site/nl-nl/en/

The customer wanted it to be restructured / simplified like this:

web.site/de-nl/
web.site/en-nl/

This basically mimics the structure of a single WPML website with custom languages, but with all the benefits of a multisite.

THE SOLUTION

This is nothing that WPML or WordPress Multisite provides out of the box.
I built a prototype setup to make it work.

Not something I would propose for everyone, as it requires a lot of tweaks for anything that handles dynamic links (plugins, hooks, core systems, page builders …).

It's doable :)

BASIC URL HANDLING

One thing that needs to be handled globally is the mapping of the new URL structure.

So web.site/nl-nl/en/ needs to become web.site/en-nl/

This needs to be handled on the server side, by proxying the original structure to the new one.
This can easily be done using Apache or NGINX.
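
As a rough illustration, the mapping in NGINX could look something like this. This is a minimal sketch with assumed paths, reading the idea as: the simplified URLs are the public ones, and requests to them are rewritten internally to the structure WordPress actually knows.

    # Sketch (assumed paths): serve the simplified public URLs from the
    # original multisite/WPML structure that WordPress resolves internally.
    rewrite ^/en-nl/(.*)$ /nl-nl/en/$1 last;
    rewrite ^/de-nl/(.*)$ /nl-nl/de/$1 last;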

With that web.site/nl-nl/en/ will be proxied to web.site/en-nl/, but any core navigation will not work yet.

This is the fastest solution that I came up with, within the hour I gave myself ;)

There surely are other options, like rewriting / restructuring the core short-URL handling. But those approaches might break things in far more areas.

Using the proxy approach keeps the core as it is. The solution needs to be as simple as possible, so it stays maintainable in the future :)

HOOKS TO THE RESCUE

For the basic setup, a couple of hooks are required to make this work; more might be needed depending on the plugins in use.

Here are a couple of examples …
WordPress site_url
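
A minimal sketch of the idea (not the original prototype code; the mapping table and helper name are assumptions):

    <?php
    // Sketch: map the original multisite/WPML URL prefixes to the simplified
    // ones whenever WordPress builds a URL. The mapping table is an assumption.
    function pz_simplify_locale_url( $url ) {
        $map = array(
            '/nl-nl/en/' => '/en-nl/',
            '/nl-nl/de/' => '/de-nl/',
        );
        return str_replace( array_keys( $map ), array_values( $map ), $url );
    }
    add_filter( 'site_url', 'pz_simplify_locale_url' );
    add_filter( 'home_url', 'pz_simplify_locale_url' );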

WordPress Nav Links
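
Menu URLs can be rewritten on output along the same lines (again just a sketch, reusing the helper from above):

    // Sketch: rewrite nav menu item URLs with the helper defined above.
    add_filter( 'wp_nav_menu_objects', function ( $items ) {
        foreach ( $items as $item ) {
            $item->url = pz_simplify_locale_url( $item->url );
        }
        return $items;
    } );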

WPML
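
WPML builds its own URLs for the language switcher; those can be adjusted through the icl_ls_languages filter (sketch, same helper as above):

    // Sketch: adjust the URLs WPML hands to its language switcher.
    add_filter( 'icl_ls_languages', function ( $languages ) {
        foreach ( $languages as $code => $data ) {
            $languages[ $code ]['url'] = pz_simplify_locale_url( $data['url'] );
        }
        return $languages;
    } );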

Rankmath
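
Rank Math exposes filters for its frontend output; the canonical URL is one obvious place the simplified structure needs to be applied (sketch):

    // Sketch: make sure Rank Math outputs the simplified canonical URL.
    add_filter( 'rank_math/frontend/canonical', 'pz_simplify_locale_url' );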

This will not cover every angle, but it will give you a starting point! I love my puzzles and there is always a viable solution :)

Need something similar … get in touch!

Happy coding …


When you look at YouTube, Twitter, certain Facebook groups and even some software companies, they are all building up fear about the upcoming / in-progress Google "Helpful Content" algorithm update.

WHAT IS IT?

– Our “helpful content update” launching next week will better surface original, helpful content made by people, for people, rather than content made primarily to gain search traffic. It’s part of a broad effort to show more unique, authentic info in results – Google SearchLiaison@Twitter

WHEN IS IT HAPPENING?

It's happening as I write this, and it's about time!

The goal of this update is to rank websites that publish original and unique content. Content written by real writers and not AIs.

This also downgrades websites that write about content not relevant to their core expertise. So no more content domain dominance by posting about every possible content angle to lure visitors in.

Old content that is not updated regularly will also lose prominence.

This is all a plus for the end user and knowledge seeker. Google has been preparing for this for years now; it is not happening suddenly.

Structured data has gained more and more importance over the past few years. Google is finally using it to clean up search!

HOW GOOD WILL IT BE?

Hard to tell, but change was needed! There are so many underrated websites out there that deliver quality content but never got a chance to bubble up or shine :) This will hopefully give us better search results and better quality control.

LOOKING FORWARD TO THE RESULTS!
Keep on breathing …. ALEX


INTRODUCTION

This is not a tutorial, but more like sharing a nice geeky road-trip ;)

I have a pretty good understanding of the YouTube Data API, as I actively used it on portalZINE TV in the past to upload videos and dynamically link them to my local post types.

For one of my latest customer projects (TYPEMYKNIFE / typemyknife.com), the task was a bit more complicated, and the goal was to make it as future-proof as possible with the Google APIs :)

Prerequisites / References to get you started:


THE GOALS

The goal of the setup was to actively synchronize WooCommerce products with their linked / attached videos, which are hosted on YouTube.

As the website is multilingual, WPML integration is critical as well. And as YouTube allows localization of titles and descriptions, that can be added into the mix quite easily in the future ;)

The following product attributes should be mirrored and optimised for Youtube:

  • Product Title
  • Product Description (5000 character limit at Youtube)
    My customer already has a pretty long and detailed description, which is perfect for Youtube!
    We average around 2900-3000 characters.
  • Product Tags (500 character length limit at Youtube)

The following attributes should be integrated into the description to enrich the Youtube description:

  • Introduction
  • Product Link to WooCommerce Product / Shop
  • Socials
  • Legal Information
  • Hash Tags (local post-type to add recurring / important hash tags)
  • Outro

All of these attributes are collected internally and assigned using a simple template system, which allows the customer to move parts around and freely lay out the description for YouTube.
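
To give a rough idea of that template step, here is a simplified sketch (not the actual plugin code; placeholder names, the template string and the sample values are made up):

    // Sketch: assemble the YouTube description from movable template parts.
    // Placeholder names and the template string are assumptions.
    function pz_build_youtube_description( array $parts, string $template ): string {
        $replacements = array();
        foreach ( $parts as $name => $value ) {
            $replacements[ '{' . $name . '}' ] = $value;
        }
        // YouTube caps descriptions at 5000 characters, so trim defensively.
        return mb_substr( strtr( $template, $replacements ), 0, 5000 );
    }

    // Usage example with made-up content ($product is a WC_Product):
    $description = pz_build_youtube_description(
        array(
            'intro'        => 'Hand-forged knives, engraved for you.',
            'description'  => $product->get_description(),
            'product_link' => get_permalink( $product->get_id() ),
            'hashtags'     => '#typemyknife #knives',
        ),
        "{intro}\n\n{description}\n\n{product_link}\n\n{hashtags}"
    );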

The following stats will be collected for review:

  • Products without a video linked (no relation)
  • Products with the same video linked (duplication)
  • Videos without a product in the system
  • Total amount of videos / amount of video-pages (50 videos max per page)
  • Total amount of products with videos

YouTube SEO

These are the key aspects that help your videos get more views:

  • Relevant text featuring at least 1,000 characters
  • Keywords that are relevant on Google Search and YouTube and describe your video
  • Hashtags, which can also be used within the text
  • Timestamps
  • Links to related content / affiliate links
portalZINE NMN | Development meets Creativity | youtube data api auth

PREPARING AUTH / OAUTH2 AT GOOGLE

In the past, access to the YouTube Data API was far easier and less limited when it comes to offline / non-expiring OAuth2 refresh tokens.

When you are building a server-side application that is only available to your customer or moderators, it makes no sense to run that app through the Google App verification. Your app will never be used in public.

The YouTube Data API and its scopes are classified as sensitive and therefore require a third-party security assessment for public access.

The scopes I am requesting are https://www.googleapis.com/auth/youtube.upload + https://www.googleapis.com/auth/youtube.

Because of that, it is far easier to just set up OAuth2 in test mode and restrict access to your customer and specific additional accounts only (up to 100 test users are allowed). What all of these accounts need is access to your own or your brand's YouTube channel.

Preparation in the Google Cloud Console:

  • Activate Youtube Data API
  • Create OAuth2 Client ID
    • Name
    • Allowed Redirects
  • Setup OAuth-Consent Screen
    • Name of the App
    • Support Email
    • Logo
    • Allowed Domains
    • Type: External
    • Set up all test users; all of these need access to your YouTube channel already

A detailed description can be found here.

You can circumvent verification for the consent screen by using an organisation setup at Google. Here is some info about that. With that setup, offline refresh tokens should work fine.

Update: I just tried that, but it won't work with a branded YouTube account, even though the cloud user has admin access to it. Not giving up yet, but Google / YouTube really makes it difficult to have a simple offline solution for specific tasks ;) BTW, I also forced the login hint to make sure the right account is logged in: $client->setLoginHint('YourWorkspaceAccount');

You might have heard of "The League of Extraordinary Packages". It is a group of developers who have banded together to build solid, well-tested PHP packages using modern coding standards.

They also offer an OAuth2 client + an OAuth2 Google extension that can be used.

SERVER SETUP

On the server, the Google API PHP SDK can be easily integrated using Composer.

In my customer plugin I neatly separated all relevant areas into classes & traits:

  • Online authentication, with a 60-minute session
  • Offline authentication, with a 1-week expiration (I might explain that in more detail in the future)
  • Error handling and debug information
  • Listing of videos
  • Updating of videos
  • Updating of tags
  • Updating hashtags
  • Backup of old data locally before an update. There should always be an option to restore, especially with 500 videos :)
  • Settings & templates
  • REST routes / endpoints

You can check the expiry time of your access token by accessing:
https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=YOUR_TOKEN

"A Google Cloud Platform project with an OAuth consent screen configured for an external user type and a publishing status of 'Testing' is issued a refresh token expiring in 7 days." – Google

Basic Auth example from the SDK:
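
The SDK is pulled in via Composer (composer require google/apiclient). A stripped-down version of the auth flow, loosely following the SDK examples (the client secret path, redirect URI and login hint below are placeholders):

    <?php
    // Minimal OAuth2 sketch based on the Google API PHP SDK examples.
    // Paths, redirect URI and account names are placeholders.
    require_once __DIR__ . '/vendor/autoload.php';

    $client = new Google_Client();
    $client->setAuthConfig( __DIR__ . '/client_secret.json' );
    $client->setRedirectUri( 'https://example.com/oauth2-callback' );
    $client->addScope( Google_Service_YouTube::YOUTUBE );
    $client->addScope( Google_Service_YouTube::YOUTUBE_UPLOAD );
    $client->setAccessType( 'offline' ); // ask for a refresh token
    $client->setPrompt( 'consent' );     // force consent, so the refresh token is actually returned
    $client->setLoginHint( 'your-account@example.com' );

    if ( ! isset( $_GET['code'] ) ) {
        // Step 1: send the user to Google's consent screen.
        header( 'Location: ' . $client->createAuthUrl() );
        exit;
    }

    // Step 2: exchange the authorization code for an access (and refresh) token.
    $token = $client->fetchAccessTokenWithAuthCode( $_GET['code'] );
    $client->setAccessToken( $token );

    // Service object used for all further YouTube Data API calls.
    $youtube = new Google_Service_YouTube( $client );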

A simple upload example can be found here.


UPDATING A SINGLE VIDEO
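
Updating a single video boils down to fetching its snippet, changing the fields and sending it back. A trimmed-down sketch (not the plugin code; $youtube is the authenticated service object from the auth example above, the video ID and the $new* variables are placeholders):

    // Sketch: update title, description and tags of one video.
    // Assumes $youtube is an authenticated Google_Service_YouTube instance.
    $videoId  = 'PLACEHOLDER_VIDEO_ID';

    $response = $youtube->videos->listVideos( 'snippet', array( 'id' => $videoId ) );
    if ( empty( $response->getItems() ) ) {
        throw new RuntimeException( 'Video not found: ' . $videoId );
    }

    $video   = $response->getItems()[0];
    $snippet = $video->getSnippet();

    $snippet->setTitle( $newTitle );             // mirrored product title
    $snippet->setDescription( $newDescription ); // assembled description (max 5000 chars)
    $snippet->setTags( $newTags );               // array of tags (max 500 chars combined)
    // The existing categoryId stays on the snippet; the API rejects updates without it.

    $video->setSnippet( $snippet );

    // A write operation like this costs 50 quota units.
    $youtube->videos->update( 'snippet', $video );
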
BULK VIDEO UPDATES

All operations to and from the YouTube Data API are rate-limited. What is important for us is the quota of queries per day.

The default quota is 10,000 units per day. That sounds like a lot, but it is easily gone after updating 150-200 videos. You can request a raise of this limit, but again that means a lot of paperwork and questions that are just not needed.

The above limit just means that you need to cache as many query results as possible and only query live when needed ;)
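
In the WordPress context, transients are an easy way to do that. A small sketch (assuming the $youtube service object from above; the playlist ID, cache key and lifetime are arbitrary):

    // Sketch: cache a channel's video list for a day to save quota units.
    function pz_get_cached_video_snippets( Google_Service_YouTube $youtube, string $playlistId ): array {
        $cacheKey = 'pz_yt_videos_' . md5( $playlistId );
        $videos   = get_transient( $cacheKey );

        if ( false === $videos ) {
            $videos    = array();
            $pageToken = null;
            do {
                $params = array(
                    'playlistId' => $playlistId,
                    'maxResults' => 50, // 50 videos max per page, 1 quota unit per page
                );
                if ( $pageToken ) {
                    $params['pageToken'] = $pageToken;
                }
                $response = $youtube->playlistItems->listPlaylistItems( 'snippet', $params );

                foreach ( $response->getItems() as $item ) {
                    $videos[] = $item->getSnippet();
                }
                $pageToken = $response->getNextPageToken();
            } while ( $pageToken );

            set_transient( $cacheKey, $videos, DAY_IN_SECONDS );
        }

        return $videos;
    }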

Something you learn fast when experimenting with different things! I hit that limit multiple times in the first few days, with around 500 videos in the queue.

Different operations cost you different amounts of units:

  • A read operation that retrieves a list of resources — channels, videos, playlists — usually costs 1 unit.
  • A write operation that creates, updates, or deletes a resource usually costs 50 units.
    10,000 / 50 = 200 updates per day :)
  • A search request costs 100 units.
  • A video upload costs 1600 units. Glad we are not handling the upload as well, but technically easy ;)

It also helps to use the Google Developer Playground to test-drive the YouTube Data API with your own credentials while optimising your own code.
You can define your own OAuth 2.0 configuration by clicking the cog in the upper right corner.

I set up the bulk updating to allow splitting it over multiple days, if required. For this, an offline refresh token is needed, as the standard access token expires after 60 minutes.

My customer can also update just a single video when changes are applied to a product or a new product has been added.

If more frequent updates are required, I will ask for a raise of the daily quota. You can circumvent the limit by using multiple Google Cloud Platform accounts with new OAuth credentials, but that is really overkill right now. I have done it in the past ;)

LAYOUT & DESIGN

The GUI is based on Bootstrap, to keep it simple and clean. I am using my own wrapper to make it work within the WordPress admin.

For all ajax operations, I am using htmx and _hyperscript, which I will talk about in another article in the future.

A really neat and clean way to build single-page interfaces.

The whole plugin runs off its own REST API endpoints. I just love using WordPress as a headless system.
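
Registering such an endpoint is straightforward (simplified sketch; namespace, route and capability are made up):

    // Sketch: a custom REST endpoint the admin GUI talks to via htmx.
    add_action( 'rest_api_init', function () {
        register_rest_route( 'pz-youtube/v1', '/videos', array(
            'methods'             => 'GET',
            'callback'            => function ( WP_REST_Request $request ) {
                // Would return the cached video list / sync stats here.
                return rest_ensure_response( array( 'status' => 'ok' ) );
            },
            'permission_callback' => function () {
                return current_user_can( 'manage_options' );
            },
        ) );
    } );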

I used Twig / Timber for the templates to separate logic and layout. Timber has been my go-to solution for years now. It drives my own and many customer websites.


CONCLUSION

This has been a lot of fun, maybe a bit too much LOL

I do geek out about many of my projects, but this experience helped me bring my WordPress toolbox to the next level. It will help drive other things in the future.

Working so deeply with the YouTube Data API has been fun, and it all feels easy now that the remaining problems have been solved.

Would have loved this during my portalZINE TV days ;)

If you read all this, you just earned yourself a badge for completion ;)

Need something similar or something else? Just say hi and we can talk.

ENJOY CODING ….


Getting started

"WPML (WordPress Multilingual) makes it easy to build multilingual sites and run them. It's powerful enough for corporate sites, yet simple for blogs." – WPML

I have been running and setting up multilingual websites for more than 12 years. WordPress and related integrations have thankfully come a long way to make our lives a lot easier.

For basic content, WPML is almost plug & play, but I do see more and more sites / customers struggling with more complex setups. WPML is one of the most popular multilingual plugins and is used on hundreds of thousands of websites.

Just so you know, WPML is a commercial solution!

Settings for almost everything

The number of settings has increased a lot over the years and offers possible solutions for almost any content / plugin setup.

But for more complex setups, I would suggest hiring a professional to look over the settings, or studying the plugin documentation carefully.

Especially with a lot of content, a wrong setting can quickly multiply problems and force you to revisit specific content over and over again.

How to translate

WPML lets you translate any text that comes from themes / theme frameworks (DIVI, Elementor, Gutenberg …), plugins, menus, slugs, SEO and additionally supported integrations (Gravity Forms, ACF, WooCommerce …).

You can translate content yourself, use translation management to work with an internal team of translators, or get help from external translators / translation services.

The latest version also offers AI translations, which allows you to get a decent start for most of your content.

In addition to the above, WPML String Translation allows you to translate texts that are not in posts, pages and taxonomy. This includes the site’s tagline, general texts in admin screens, widget titles and many other areas.
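
For custom code, registered strings are handled through WPML's documented hooks. A small sketch (the context and string name are made up):

    // Sketch: register a custom string with WPML and fetch its translation.
    do_action( 'wpml_register_single_string', 'my-plugin', 'cta-label', 'Book your tour now' );

    $label = apply_filters( 'wpml_translate_single_string', 'Book your tour now', 'my-plugin', 'cta-label' );
    echo esc_html( $label );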

Is it worth the money?

Well, I am a bit biased. I have not looked much at other solutions for the past 5 years, as it offers all I really need.

I have used it on projects with 2 to 15 languages and it scales nicely. At least with proper hosting attached!

Anything can be tweaked through the API, hooks and custom integrations. I have built additional WPML tools for my customers to streamline some of the repetitive / boring tasks.

Their support is responsive and the forum already provides a huge amount of answers to most of the questions that might come up.

If you develop / maintain multiple customer websites with multilingual content, the investment is quickly amortized. I do offer WPML to my maintenance package customers, maybe something to consider ;)

It's an essential solution in my WP toolbox.

WPML 4.5

WPML 4.5 is on its way and will include a "Translate Everything" feature, among other fixes and enhancements.

Translate Everything allows you to translate all of your site’s content automatically as you create it. You can then review the translations on the front-end before publishing.



WPML / Documentation

18. November 2019

Structured Data – I love it!

You might have heard about Structured Data, Schema.org and JSON-LD.  

Search engines read structured data and use it to enhance search engine results. Structured data helps search engines to understand and categorize page content.

Example

This structured data, in JSON-LD format, describes a simple Article:
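
A minimal example of what such an Article block can look like (all values are placeholders based on this post):

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Article",
      "headline": "Structured Data – I love it!",
      "datePublished": "2019-11-18",
      "author": {
        "@type": "Person",
        "name": "Alex"
      },
      "publisher": {
        "@type": "Organization",
        "name": "portalZINE NMN",
        "logo": {
          "@type": "ImageObject",
          "url": "https://example.com/logo.png"
        }
      },
      "image": "https://example.com/article-image.jpg"
    }
    </script>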

Schema.org is a collaborative, community activity with a mission to create, maintain, and promote schemas for structured data on the Internet. But not all structured data endpoints are actually used by Google, Bing or other search engines yet.

Allowed Structured Data

Google provides a detailed overview of structured data allowed and used for search results.

There are basic enhancements you can use, like the Article structured data above. There are also many other more specific uses, like Video, LocalBusiness, Events, FAQ, Job Postings, Recipe and so on. Bing also provides a basic overview, but their documentation is scattered and feels incomplete.

How to integrate Structured Data

If you use a modern CMS, many structured data endpoints are already integrated out of the box (Article, Website, Logo, Person …).

Modular content management systems also often offer additional functionality through plugins that help integrate structured data directly. Some do it better than others!

But if you really want to dive deep and integrate all those little things, structured data is still far more powerful when added manually. Especially things like events, products, job listings, courses and Q&As can be greatly enhanced by hand.

Alex@portalZINE

Validating Structured Data

Google and Bing offer validation tools for structured data. Both integrate them into their webmaster tools. You can also use the JSON-LD Playground to validate the JSON-LD itself, or RDFa Play, the Structured Data Linter, the Facebook Debugger, the Schema.org Generator and many other tools.

Need help?

I am a huge structured data fan and have been working with it for years now. I am constantly looking for newly supported structured data endpoints to enhance my own and customer websites & data.

Google constantly updates its documentation and highlights experimental structured data endpoints, like Speakable, which marks sections of a website that are best suited for audio playback.

Fresh structured data helps to promote your content and enhance SEO, directly improving your discoverability and your search engine position. Your content becomes more meaningful to search engines, making it easier for them to promote it to the right potential user. It also ties into the GO GREEN concept, as you reduce bounces from users who were offered the wrong content.

Things like recipes and how-tos are already pushed to the top of the search index. A perfect way to promote your website and get noticed.

Have fun using or discovering structured data!
