VR has not just arrived, it has finally arrived for me and the masses ;)
I have been watching VR evolve from the sidelines for the past few years. It's been a fun ride, from the first prototypes to what we have now.
For me, the biggest problems in the past were image quality and the huge upfront investment. With the latest generation all of this has completely changed.
I am constantly keeping up with new technologies and have been diving into WebVR for some time now.
It's so easy to export your own simple Unity VR project to WebVR and integrate it into your own web projects.
Unreal Engine also provides options for VR projects, or you can build directly on the WebXR API.
Microsoft also has a foot in the door with the MRTK, which allows you to easily integrate features like teleporting.
With the announcement of the Oculus Quest 2 last year, I finally decided to dive in myself. Standalone VR allows me to concentrate on WebVR and experiment, while still having the option to expand into the linked PC-universe in the future as well.
I have no plans to invest in a beefy gaming rig yet, but I have been trying out cloud solutions using Shadow and Paperspace Gaming / Paperspace $10 Coupon (KS4Q2TA).
Update: Latency is the biggest problem with these solutions, and it can be hit or miss! My Paperspace experience was close to perfect for pure desktop use; for VR, sadly not :) I can't test-drive Shadow right now, as the waiting list already extends to September 2021. LOL, crazy! Local latency can be greatly improved by using Wi-Fi 6 or Wi-Fi Direct.
With the current GPU prices, building a machine hardly makes any sense. I would love to build a small form-factor PC with a Shuttle XPC, for example ;) Maybe later this year …
I invested in VR comfort for now ;) I transformed the Oculus Quest 2 into a FrankenQuest with the DAS mod (HTC Vive Deluxe Audio Strap). Loving the new look, the audio and the perfect weight balance :) Perfect for longer sessions.
I am officially infected. I decided to upgrade an old computer to minimum VR specs, to at least try out PC-VR :)
The final build is ghetto and really a tight fit, but it works perfectly :)
For 2021 and the current shortages, this is a big win! The whole upgrade was about 400 EUR, with 250 EUR for the new GPU.
Here are my current PC-VR specs:
Area | Before | After / Comment |
---|---|---|
PC | Fujitsu P900 – i3 | Nice clean case. Mainboard D2990 (ultra-small µATX). Really tiny! This would normally not be my dream mainboard, but I am using what I have :) Trying to keep costs low. |
Power Supply | Stock | EVGA 600 W1. Better cooling and the power needed for the GPU. |
CPU | i3-2120 | i7-2600K. Big change in overall performance. |
CPU Cooler & Fan | Stock | be quiet! Pure Rock Slim BK030. (Had to do some mods to install it) |
RAM | 8 GB | Enough for now. |
GPU | NVIDIA GT 1030 – low profile | Inno3D NVIDIA GTX 1060 6 GB. That card takes up all the slots, so I had to add a riser card to play with some USB 3.0 cards :) |
USB 3.0 | USB 2.0 | Inateck PCIe USB 3.0 – KTU3FR-4P, again connected via a riser card! Make sure the card gets proper power (green light on the card itself); that is why another Inateck card failed to connect or had random disconnects ;) This card charges and connects perfectly with the Oculus Quest 2. I was almost giving up and am glad I found a working solution ;) |
USB Link | – | 3 m cable from KIWI design. Works without any problems and has a secure fit on the Oculus Quest 2. |
Bluetooth | – | Bluetooth 4.0 – ASUS BT-400 |
Everything required has been added, and the PC-VR universe is now open for exploration :) As hardware is getting more expensive every week, I upgraded a second machine to comparable specs, so I now have two machines that can run VR with entry-level / decent specs :)
There are many new platforms providing access to new tools and often easy access to a broader community, some of them with nice in-VR build tools.
“OpenXR is an open, royalty-free standard for access to virtual reality and augmented reality platforms and devices. It is developed by a working group managed by the Khronos Group consortium.”
You will find a bunch of efforts under way to build the next open multi-platform VR solution.
Building with Unity is always an option, but not the best solution for those who are just getting started or who just want a simple starting point to experiment ;)
Another evolving area is the office space. Some of the platforms above already dive into that area, like XRDesktop. But Oculus / Facebook itself is working on its Infinite Office integration.
Other solutions help to mirror your PC within VR and open new ways for collaboration:
The biggest problem is mirroring the keyboard. As soon as that is solved, this might become truly usable. Immerse VR provides an option to overlay a virtual representation of your keyboard, by mapping your real-life keyboard in VR.
It is always important to stay informed. Here are some communities I frequently visit:
VR is here to stay. I would never have thought it would take off in 2020/2021, but we all face new challenges and technology is evolving to make space for new possibilities.
While gaming, fitness and social are currently the entry points for VR, this whole market will expand quickly in 2021.
Really looking forward to new possibilities and another facet of my developer life.
Looking forward to meeting some of you in the VR multiverse :)
I am always looking for easy ways to white-label the WordPress administration for myself and my clients. A nice personal touch for each project and an easy way to declutter the interface.
These are my personal favorites, that I use on a regular basis.
There are a lot of solutions out there, but many are heavy to load and break easily, some even on new WordPress upgrades. The first two below are currently my favorites.
When sharing the administration with your customers, you often need to make it as simple as possible for them. Depending on your setup, the menu becomes cluttered and overwhelming really fast.
I often trim the menus for each user role, so that only those options that are really needed remain accessible.
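A minimal sketch of that approach (the menu slugs and the capability check below are just example assumptions, not a drop-in solution):

<?php
// Hide admin menu items from users who cannot manage options (e.g. editors).
add_action('admin_menu', function () {
    if (!current_user_can('manage_options')) {
        remove_menu_page('tools.php');                    // Tools
        remove_menu_page('edit-comments.php');            // Comments
        remove_submenu_page('themes.php', 'widgets.php'); // Appearance > Widgets
    }
}, 999); // Run late, so the menu is fully registered before trimming.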
When sharing the administration with multiple users, it's always nice to add some personality to the user profiles as well.
WP User Profiles
“WP User Profiles is a sophisticated way to edit users in WordPress.”
The plugin comes with other small add-ons, like WP User Avatars. A neat plugin to tweak admins, editors and other users.
Enjoy
Alex
You might have heard about Structured Data, Schema.org and JSON-LD.
Search engines read structured data and use it to enhance search engine results. Structured data helps search engines to understand and categorize page content.
This structured data, in JSON-LD format, describes a simple Article.
{
  "@context": "http://schema.org",
  "@type": "Article",
  "author": "John Doe",
  "interactionStatistic": [
    {
      "@type": "InteractionCounter",
      "interactionService": {
        "@type": "WebSite",
        "name": "Twitter",
        "url": "http://www.twitter.com"
      },
      "interactionType": "http://schema.org/ShareAction",
      "userInteractionCount": "1203"
    },
    {
      "@type": "InteractionCounter",
      "interactionType": "http://schema.org/CommentAction",
      "userInteractionCount": "78"
    }
  ],
  "name": "How to Tie a Reef Knot"
}
Schema.org is a collaborative, community activity with a mission to create, maintain, and promote schemas for structured data on the Internet. But not all structured data endpoints are actually used by Google, Bing or other search engines yet.
Google provides a detailed overview of structured data allowed and used for search results.
There are basic enhancements you can use, like the Article structured data above. There are also many other more specific uses, like Video, LocalBusiness, Events, FAQ, Job Postings, Recipe and so on. Bing also provides a basic overview, but their documentation is scattered and feels incomplete.
If you use a modern CMS, many structured data endpoints are already integrated out of the box (Article, Website, Logo, Person …).
Modular content management systems also often offer additional functionality through plugins, which help integrate structured data directly. Some do it better than others!
But if you really want to dive deep and integrate all those little things, structured data is still far more powerful when added manually. Especially things like events, products, job listings, courses and Q&A can be greatly enhanced by hand.
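To give a rough idea of manual integration in a WordPress context (the hook usage is real, but the hard-coded values are placeholder assumptions, not production code), you can print a JSON-LD block into the page head yourself:

<?php
// Print a JSON-LD Article snippet into the page head.
// Replace the hard-coded placeholders with your real post data.
add_action('wp_head', function () {
    $schema = [
        '@context' => 'https://schema.org',
        '@type'    => 'Article',
        'name'     => 'How to Tie a Reef Knot',
        'author'   => 'John Doe',
    ];
    echo '<script type="application/ld+json">' . wp_json_encode($schema) . '</script>';
});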
Google and Bing offer validation tools for structured data. Both integrate it into their Webmaster Tools. You can also use the JSON-LD Playground to validate the JSON-LD itself or RDFa Play, Structured Data Linter, Facebook Debugger, Schema.org Generator and many other tools.
I am a huge structured data fan and have been working with it for years now. I am constantly looking for new supported structured data endpoints, to enhance my own or customer websites & data.
Google constantly updates their documentation and highlights experimental structured data endpoints, like Speakable, which marks up sections of a website that are best suited for audio playback.
Fresh structured data helps to promote your content and enhance SEO, directly improving your discoverability and your search engine position. Your content becomes more meaningful to search engines, making it easier for them to promote it to the right potential user. It also ties into the GO GREEN concept, as you reduce bounces from users who were offered the wrong content.
Things like recipes and how-tos are already pushed to the top of the search index. A perfect way to promote your website and get noticed.
I signed the Sustainable Web Manifesto a couple of weeks ago. The manifesto perfectly reflects how I have been handling my business and my projects.
I created a special “GO GREEN” subsection to talk about the topic in more detail and give you some more context about the areas I can help you with.
We all share and use the web, just as we all share and live on this planet. This manifesto is a public declaration of a shared commitment to create a sustainable internet.
https://www.sustainablewebmanifesto.com/
“If we embrace sustainability in our work, we can create a web that is good for people and planet.”
Together with my partners in crime (Dorit & Micha), we have finally opened our own personal online store.
We have been selling our single origin coffees (1st Single Malt Whisky Coffee, Basic – Single Origin Arabica, Kill me Quick Espresso – Single Origin Robusta), teas (Kräuterschorle – Kräutertee, Feuerkieker – Schwarztee) and rum (Fortune Teller – Double Aged Barbados Rum) via the Amazon Marketplace for the past 2 years.
GreenApe has been a side project for the past years and I never wanted to deal with the maintenance of our own store. But it's time to move on and do our own thing. Amazon has removed so many useful features over the years, or added new fees on top of other fees. Even though Amazon provides access to a large customer base, for small companies the fees add up quickly.
With our own store we can finally do bundles and coupons again, and better optimize shipping. It will also allow me to test-drive some interesting new features for my customers ;) Yeah, it's kind of my new toy, or shopping lab! It's fun being able to work on untested new SEO features, structured data, merchant tools, shopping ads and the tracking of all of those.
We have been selling in Germany for the past 2 years, but that might change in the future, depending on how well the new store shapes up :)
If you live in Germany, love good coffee, tea or rum … say Hi!
GreenApe – Makes Your Life Better
Homepage
Shop
Contact us
Development today relies on multiple teams, services, and environments all working in unison. A topic that always comes up, when setting up a new development environment: How do we secure important credentials, while not making it too complicated for the rest of the team?
The key when working with version control systems like Git is to keep any type of credentials out of the versioning system. These can be API keys, database or email passwords.
Even if it's a private repository, development environments might change. It can be a simple staging & live website setup you are maintaining.
DB_HOST=localhost
DB_USER=username
DB_PASS=password
The simplest way in PHP is to use .env files to store your credentials outside of the publicly accessible directory structure. So outside public_html, but still within reach of the executing environment. Once included in your code, variables are accessible through $_ENV['yourVar'] or getenv("yourVar").
To make it simple you can use the popular package vlucas/phpdotenv, which reads and imports the file automatically.
<?php
require_once __DIR__.'/../vendor/autoload.php';

$dotenv = new Dotenv\Dotenv(__DIR__.'/../');
$dotenv->load();
?>
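Once loaded, using the credentials is just a matter of reading the variables. A minimal sketch, assuming the .env example above and a hypothetical MySQL database named "app":

<?php
// Read the loaded variables ...
$host = $_ENV['DB_HOST'];
$user = $_ENV['DB_USER'];
$pass = $_ENV['DB_PASS'];

// ... and hand them to PDO for a database connection.
$pdo = new PDO("mysql:host={$host};dbname=app", $user, $pass);
?>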
Don’t fool yourself, if an attacker finds a way into your system, these variables can be easily read. This is just hiding the file from public access and provides some convenience while developing or sharing code.
Some people propose to encrypt / decrypt environment variables using a secret key. But if an attacker can access your data, he can also find the secret key.
There are some nice packages that offer just that. You have to decide if those fit your needs.
psecio/secure_dotenv – provides an easy way to handle the encryption and decryption of the information in your .env file. @Github

johnathanmiller/secure-env-php – env encryption and decryption library that prevents committing and exposing vulnerable plain-text environment variables in production environments. The lib provides a nice guided interface to encrypt your .env file. @Github

beyondcode/laravel-credentials – adds encrypted credentials to your Laravel production environment. You can edit and encrypt using php artisan credentials:edit. @Github

The Apache2 environment variables are set in the /etc/apache2/envvars file. These variables are not the same as the environment variables of your Linux system; they are stored and manipulated in an internal Apache structure.
The /etc/apache2/envvars file holds variable definitions such as APACHE_LOG_DIR (the location of the Apache log files), APACHE_PID_FILE (the Apache process ID), APACHE_RUN_USER (the user that runs Apache, by default www-data), etc.
You can open and modify this file in a text editor of your choice. This is nice, but far from simple, and it requires a server restart. It is something that helps when hardening security on a live deployed setup.
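A minimal sketch of the idea (the variable name and value are made up for illustration): export the variable in envvars, hand it to the application via mod_env and read it in PHP after a restart.

# /etc/apache2/envvars - define a custom variable
export APP_DB_PASS='super-secret'

# vhost or apache2.conf - pass the variable through to the application (mod_env)
PassEnv APP_DB_PASS

# PHP - read it after the Apache restart
<?php $dbPass = getenv('APP_DB_PASS'); ?>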
There are dynamic approaches, but you can do some research for that yourself :) Skipped that rabbit hole for now …
Handling secrets completely detached is another possibility. This is surely overkill for most cases, but an Infrastructure Secret Management concept might be worth looking into if you are working on bigger-scale projects that involve multiple development teams and setups. These services also often deal with secret rotation.
HashiCorp Vault – “Vault is a tool for securely accessing secrets. A secret is anything that you want to tightly control access to, such as API keys, passwords, certificates, and more. Vault provides a unified interface to any secret, while providing tight access control and recording a detailed audit log.”
You can deploy your own vault on your own infrastructure or test out a hosted version, which is free for Open Source projects. HashiCorp Vault
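The basic flow with the Vault CLI looks roughly like this (the secret path and keys are just example assumptions):

# Start a local dev server (in-memory, for experiments only!)
vault server -dev

# Store and read a secret via the KV engine
vault kv put secret/myapp DB_PASS=super-secret
vault kv get -field=DB_PASS secret/myapp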
You will find a bunch of HashiCorp-related packages that will help you integrate a vault into your project workflow (scmrus/php-vault-env, poc-webapp-vault).
While this is nice, you will need to cache / store credentials somewhere, as you don’t want to query the vault on every single access.
The HashiCorp Vault is not the only Infrastructure Secret Management solution. There is a GitHub Gist that lists other solutions, along with a nice feature matrix.
Amazon also provides a solution called AWS Secrets Manager, which makes a lot of sense, when you build and deploy on AWS already :)
I will use this article to collect interesting tips and tricks about using the Linux cron. This is not so much about setting up a cron, but about little things I use or discovered!
The cron daemon is a long-running process that executes commands at specific dates and times. You can use this to schedule activities, either as one-time events or as recurring tasks.
For commands that need to be executed repeatedly (e.g., hourly, daily, or weekly), you can use the crontab command. The crontab command creates a crontab file containing commands and instructions for the cron daemon to execute.
Format is: MIN HOUR DOM MON DOW CMD
Minute field
Hour field
Day of month
Day of week
Command
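A quick worked example (the path is made up): the line below runs a backup script at minute 30, hour 2, on any day of month, in any month, on day-of-week 1 - so 02:30 every Monday.

# MIN HOUR DOM MON DOW CMD
30 2 * * 1 /home/reggaenights/backup.sh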
crontab -l          # View the cronjobs of the currently logged-in user
crontab -e          # Edit the crontab of the currently logged-in user
crontab -l -u $USER # View the crontab of the specified user
crontab -e -u $USER # Edit the crontab of the specified user
Run every 5 minutes
*/5 * * * * /home/reggaenights/script.sh
Run yearly, monthly, weekly, daily or on reboot.
@yearly will run at 00:00 on Jan 1st for every year.
@monthly will run at 00:00 on 1st of every month.
@weekly will run at 00:00 at the start of every week.
@daily will run at 00:00 every day.
@reboot will run once after the server has been rebooted.
@yearly /home/reggaenights/script.sh
@monthly /home/reggaenights/script.sh
@weekly /home/reggaenights/script.sh
@daily /home/reggaenights/script.sh
Pipe the output of a job straight to email:

*/30 * * * * /bin/bash /cleanup | /usr/bin/mail -s "Notify me" your@email.org
Unlike the virtual WordPress cron, a real cron does not rely on website activity and executes independently.
*/30 * * * * /usr/bin/wget -q -O - https://yourwordpress.org/wp-cron.php?doing_wp_cron
Do not forget to disable the virtual WordPress Cron in the wp-config.php!
define('DISABLE_WP_CRON', true);
The same call, discarding all output:

*/30 * * * * /usr/bin/wget -q -O - https://yourwordpress.org/wp-cron.php?doing_wp_cron >/dev/null 2>&1
You can set environment variables at the top of your crontab:

# Email to send output to
MAILTO="a@b.com,b@b.com"

# Set up your path for reuse
PATH="/usr/bin:/sbin:/bin"

# Tells cron which directory to execute the crontab commands from
HOME="/path/to/app/root"

# Set the default shell
SHELL="/bin/bash"
Log the output of a job to a file:

*/15 * * * * /home/reggaenights/script.sh >> /home/collect/cron/output/pipe.log 2>&1
Gatsby is a free and open source framework based on React that helps developers build blazing fast websites and apps.
While researching some popular static site generation tools, GatsbyJS comes up often. I have played with NuxtJS and Hugo in the past, but what I REALLY like about GatsbyJS is the plugin / modular system. You can build your website with plain-old React and CSS styles, but make your development more efficient by adding node_modules.
Also, being able to import any data source with ease using GraphQL is amazing. And when it comes to content management, you can easily hook a headless WordPress or Drupal setup into the mix and consume their REST APIs :)
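As a small sketch of what that looks like (assuming gatsby-source-wordpress is configured; the allWpPost node and its fields are assumptions based on that plugin), a page component can pull WordPress posts through GraphQL at build time:

import * as React from "react"
import { graphql } from "gatsby"

// Gatsby injects the result of the page query below as the `data` prop.
export default function BlogIndex({ data }) {
  return (
    <ul>
      {data.allWpPost.nodes.map(post => (
        <li key={post.id}>{post.title}</li>
      ))}
    </ul>
  )
}

// Page query, executed once at build time.
export const query = graphql`
  query {
    allWpPost {
      nodes {
        id
        title
      }
    }
  }
`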
I am not switching my own website to GatsbyJS anytime soon, but it's another tool in my toolbox for future project consideration!
There are many tutorials on YouTube about getting started, maybe something to consider for the next free-time test drive ;) Enjoy …
Manet is a REST API server which allows capturing screenshots of websites using various parameters.
The Node.js server can use SlimerJS or PhantomJS as headless browser engines.
I have built something similar with CasperJS, but this is far better for those who want a simple, straightforward solution.
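If I recall the README correctly, usage boils down to a simple HTTP call; treat the port and parameter names below as assumptions to verify against the project docs:

# Fetch a screenshot of a page (port and query parameters assumed from the Manet README)
curl "http://localhost:8891/?url=github.com&width=1024" -o screenshot.png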
Since I started in 2002, all iterations of portalZINE have been pure English content websites. You can read about the why on my services page.
I had potential customers in Germany complain about that a lot over the past few years. But your own website often suffers, while your customers get all the attention. That is how it is and how it should be!
Creating multi-language websites has been part of my services & portfolio for years, with an extreme application setup handling 13 languages in 2014 for the soccer world cup.
Multi-language setups have come a long way and it was time to showcase that on my own setup as well. Not only to calm those potential customers, but to test-drive new functionality and possibilities myself. portalZINE has always been my test lab for stability and new feature sets.
Most of my static pages are available in English and German now, the blog itself will remain pure English.
Need help setting up a multi-language website? Get in touch!
Cheers
Alex