This article is a condensed version of this article (https://www.windowscentral.com/how-enable-precision-touchpad-drivers), with clearer specifics for the Razer Blade laptop.
Why switch to these drivers? The biggest disappointment when I bought my Blade earlier this year was that it didn’t support Precision Touchpad drivers, which means none of the fancy new gestures that came with Windows 10 work. These gestures make your Windows desktop behave a lot like a MacBook. I use a MacBook at my day job and I rely on these gestures all the time.
What gestures? Primarily:
- Four finger swipe to switch between virtual desktops. Without this, virtual desktops are pretty useless because switching is an effort. My typical workflow is a few full-screen apps across, say, 4 virtual desktops: Unity, Visual Studio, Git Bash, Chrome.
- Four finger swipe up to show task switcher / list of windows
- Three finger task switching
Precision Touchpad is an effort to standardize how touchpads work on Windows 10 across laptop manufacturers. To learn more, see this article: https://www.howtogeek.com/286905/what-is-a-precision-touchpad-on-windows-pcs/
How to enable Precision Touch on the Razer Blade
- Have a mouse handy; this process will temporarily disable your trackpad.
- Download these drivers if you have a Synaptics touchpad: link, OR these if you have the Elan touchpad. Just check your touchpad’s name in Windows Device Manager.
- Unzip the drivers.
- In Device Manager, right click on your touchpad and choose ‘Update driver’ -> ‘Browse my computer for driver software’ -> ‘Let me pick..’ -> ‘Have Disk’ -> then browse to the spot you unzipped the drivers and pick autorun.inf.
- The trackpad won’t work yet; repeat the previous step, but this time click ‘Search automatically…’.
- Open regedit, go to HKEY_LOCAL_MACHINE\SOFTWARE\Synaptics\OEM\TouchPad and set DisableDevice to 1. Otherwise there will be competing drivers/software, and when you resume from sleep the wrong driver will load.
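If you’d rather script the registry change than click through regedit, the same edit can be made with Python’s standard-library winreg module. This is a sketch, not an official tool: it only runs on Windows, needs an elevated (administrator) prompt, and uses exactly the key and value named in the step above.

```python
import sys

# Registry location of the Synaptics setting (from the step above).
KEY_PATH = r"SOFTWARE\Synaptics\OEM\TouchPad"
VALUE_NAME = "DisableDevice"

def disable_synaptics_service():
    """Set DisableDevice = 1 so the old Synaptics software stays off
    after resuming from sleep. Requires Windows and admin rights."""
    if sys.platform != "win32":
        raise RuntimeError("This registry edit only applies to Windows")
    import winreg  # Windows-only standard-library module
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH,
                        0, winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, VALUE_NAME, 0, winreg.REG_DWORD, 1)
```

As always with registry edits, back up the key first; if the drivers misbehave you can set the value back to 0 the same way.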
I wrote this article because the original source article was missing the registry step, and I only found it in the comments section. Razer still hasn’t put out official support for this, but it really is the cherry on top for this laptop. Hopefully they will support it officially soon.
I foresee a “near” future where VR and AR are achieved on the same device. It’s a matter of whether you show the real world or not; the tracking and rendering would remain the same. I think a dead simple way to prototype this is by adding electrochromic glass to a well-rounded AR unit.
Let’s start with that and make something more sophisticated after. Let me see the real world, or let me not see the real world. Current AR solutions cannot do opaque pixels so I think an electrochromic filter would be the first step to merge the two platforms.
I’m really excited for a future where the VR world you play in is mapped 1:1 with the real world. So all of the physical walls in your house are remapped with artificial textures. Same geometry but different skin.
If you were to play a game where 100% of what you see fits the physical geometry of your home, but also 100% of what you see are virtual textures, is it Virtual Reality or Augmented Reality at that point?
I feel like it deserves its own word, but I dare not say the M(ixed) word, considering the garbage marketing that Microsoft has used to taint that name.
VR is a stepping stone, and after that I’ll also say AR is a stepping stone. I think VR/AR/MR is a matter of which pixels you see as real or artificial. When you have the fluid choice between real-world “pixels” and virtual pixels to compose your vision, you can start to invent and witness some pretty amazing stuff.
I wanted to do a post that covers the different experiments I’ve done in some games I created. I was looking at my past work and realized a common theme. I would take an established game type and attempt to add a new mechanic to it.
Short Range Sensors
Short Range Sensors is based on a game type that has existed for decades: swarms of enemy ships coming toward you while you defend yourself and Earth. The earliest game of this type might be Space Invaders.
Experiments in this game:
- 3D gameplay. An inspiration for this game was Incoming, which was released around the Voodoo 2 era.
- Adding 360-degree combat. In my game you are defending not only yourself but also the planet Earth behind you. If either you or Earth is destroyed, the game is lost. Placing the player between the source of attack and Earth means that while defending both targets, you are vulnerable to attack whenever you face toward Earth to defend it.
Bricktastic
Ball and paddle games date back to Pong, and games like Bricktastic are based on a game type that might have started with Arkanoid or Breakout.
Experiments in this game:
- 3D gameplay with touch screen or mouse controls.
- Animated / moving bricks.
- Infinite-runner-style, continuous gameplay: I wanted to make a game like this with no level loading. Every time a level is completed, the back wall opens and you animate into the next room.
- Moving bricks in different patterns: I wanted to avoid the issue seen with static bricks, where the ball gets stuck in a pattern that takes too much time and finesse to escape before you can finish the level.
- VR: Seemed like a good idea at the time, but the animation of loading new levels was kind of nauseating. You can still try it on cardboard however: https://play.google.com/store/apps/details?id=com.Daggasoft.BricktasticVR
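The “no level loading” idea above boils down to seeding each room from the previous one, so rooms can be generated forever without storing them. Here is a minimal sketch of that scheme; the brick density and grid size are my assumptions, not Bricktastic’s actual generator:

```python
import random

def next_room(seed, width=8, rows=4):
    """Deterministically generate a brick layout from a seed, plus the
    seed for the room after it. Same seed -> same room, so an endless
    chain of rooms needs only the current seed, never a saved level."""
    rng = random.Random(seed)
    # True = brick present; ~60% fill is an illustrative density.
    layout = [[rng.random() < 0.6 for _ in range(width)]
              for _ in range(rows)]
    next_seed = rng.randint(0, 2**31 - 1)
    return layout, next_seed
```

When the back wall opens, the game would spawn `next_room(next_seed)` behind it and animate the player through, with no loading screen.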
Super Mario Breakout
Experiments in this game:
- This game isn’t just a Breakout clone skinned with Mario sprites. The experiment in this game was to bring the functionality of the Mario question blocks into a ball and paddle game. I think this served as a very fun way to gain power-ups and added a new level of challenge.
- Using a turtle shell instead of a ball could have been a superficial sprite change, but effort was put into making the shell behave like it would in a Mario game. Considering how a shell can be used in a Mario game, this also feels appropriate.
Experiments in this game:
- The LCARS system in Star Trek is touch controlled, but when I made this game VR had no touch controllers yet. To use the panels in the shuttlecraft, I used gaze selection to pick individual buttons on the panel.
Roof Runner VR
Experiments in this game:
- Roof Runner VR was an experiment in building endless, semi-random levels as used in infinite runners, while also bringing what is traditionally a 2D side scroller into a 3D virtual reality experience.
Peripheral
Experiments in this game:
- Peripheral originally started out as a virtual reality game but ended up being more entertaining as a traditional first person game in the vein of Portal.
- The core mechanic revolves around objects in the game world being affected depending on whether you are looking at them.
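A look-at mechanic like this usually reduces to a cone test: compare the camera’s forward direction against the direction to the object. This is an illustrative sketch of the idea only; the threshold angle is made up, and the actual game would use the engine’s own vector types:

```python
import math

def is_looked_at(cam_pos, cam_forward, obj_pos, fov_deg=30.0):
    """Return True if obj_pos lies within fov_deg of the (normalized)
    view direction cam_forward, as seen from cam_pos."""
    to_obj = [o - c for o, c in zip(obj_pos, cam_pos)]
    dist = math.sqrt(sum(d * d for d in to_obj))
    if dist == 0:
        return True  # object at the camera: treat as visible
    to_obj = [d / dist for d in to_obj]
    # Dot product of unit vectors = cosine of the angle between them.
    cos_angle = sum(f * d for f, d in zip(cam_forward, to_obj))
    return cos_angle >= math.cos(math.radians(fov_deg))
```

Each frame, objects for which this returns True would behave one way (freeze, move, change) and the rest another, which is the whole trick behind the mechanic.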
Mana Storm
Experiments in this game:
- Bringing traditional CCG/TCG into a 3D world
- Using motion controllers to interact and use cards
- Real world scale of summoned minions
- Player presence in the battle field
- Motion controllers to attack and defend using hero powers
We Are Live
Please purchase our game on Steam here: http://store.steampowered.com/app/548560/
Please check back on this blog for further development updates!
Latest trailer as we are about ready to go live on Steam Early Access: http://store.steampowered.com/app/548560/ A lot has changed since October 23rd, and more is to come!
Where do I even begin with this update? I’m amazed at the crazy amount of work that I was able to get done in just a few weeks in order to make the deadline for the MSI VR Jam.
Above all else, I’d mention the animated cards. Each card has an image of the spell or minion it will summon. When you pick up a card, it begins to animate, showing what the spell or minion does. I’m using OGV video played on movie textures to accomplish this. Having all the videos play at once (every card on the table) has a negative performance impact, so I wrote a script to only play the video when the card is in your hand. When the card is on the table, it shows a paused first frame of the video. It seems to work really well and was a lofty stretch goal that I didn’t think I’d have time for. I think it’s a cool touch to the game.
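The optimization described above is essentially a tiny state machine per card: play while in hand, pause on the first frame on the table. A hedged sketch of that logic follows; the class and method names are mine, not the actual Unity script:

```python
class CardVideo:
    """Per-card playback state: the preview video runs only while the
    card is held, and rewinds to frame 0 when it sits on the table."""

    def __init__(self):
        self.playing = False
        self.frame = 0

    def set_in_hand(self, in_hand):
        if in_hand and not self.playing:
            self.playing = True          # picked up: start animating
        elif not in_hand and self.playing:
            self.playing = False         # put down: stop and rewind
            self.frame = 0               # table shows the first frame

    def tick(self):
        """Advance one frame per update, but only while playing."""
        if self.playing:
            self.frame += 1
```

With every card on the table paused at frame 0, only the handful of cards actually being held cost any video decoding per frame, which is where the performance win comes from.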
Secondly, all of the hero powers work. This was another stretch goal. They will receive attention and improvements in future builds, especially the shield, but once again I didn’t think I would get these working well in the amount of time given. You can cast energy beams from your controller to deal damage, use the bow and flaming arrows to deal fire damage, and use the animated shield to protect yourself from attacks. I have a handful of other spells like the energy beam that I will be bringing into the game.
The system for using the controller to select minions and have them attack, as well as the overall system for minion attack sequences and enemy AI, was a scary thought. It was tough to get it to work well and feel right, but I managed to build something that works and can be tweaked in future builds. The same is true for the HUD / stats display: it’s a good foundation that can get better with time.
Lastly, another scary thought was game balance. In early builds of the game it was far too easy to win or far too easy to lose; it had some serious flaws. I believe I now have the core game rules locked down to favour an even chance of winning or losing, leaving it up to the player’s instinct and strategy to win. Going forward, this makes it easier to massage the balance by changing card properties (cost, health, attack, buffs). I think it’s a good first stab at having it play how a card game should.
Mana Storm is a virtual reality, player-vs-player, spell-casting game. It uses familiar mechanics seen in magic card games and tabletop RPGs, including cards, dice, sand timers, etc.
The player can spawn minions and cast spells to add buffs such as taunt, shield, charge, etc. The player can also cast spells to use hero powers such as fire and ice beams.
As an alchemist the player exists in a room scale laboratory / battle station. The player will have to walk around between different crafting and playing tables, pick up and combine materials to create different enchantments and spells.
Week 4 update
Almost too much to list, getting excited:
- Full battle sequence working and the ability to win the game
- Enemy UI
- Basic end-turn switch on the table that the player can press
- New minions and animations
- Hero powers: archer bow, warrior shield, mage fireball
- Sound effects
- Tutorial and key mappings
- Stats UI for minion and player health
- Global game manager queue to orchestrate gameplay
- Much more
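The “global game manager queue” item deserves a sketch: the idea is a single queue that processes gameplay events strictly in order, so draws, attacks, animations, and AI turns never overlap. This is illustrative only; all the names are my assumptions, not the project’s code:

```python
from collections import deque

class GameplayQueue:
    """Process gameplay events one at a time, in order. Handlers may
    enqueue follow-up events (e.g. an attack queuing a death animation),
    which run after everything already queued."""

    def __init__(self):
        self._queue = deque()
        self.log = []  # names of processed events, for inspection

    def enqueue(self, name, handler):
        self._queue.append((name, handler))

    def run(self):
        while self._queue:
            name, handler = self._queue.popleft()
            self.log.append(name)
            handler(self)  # handler gets the queue to add follow-ups
```

For example, an attack handler that queues a death animation still lets the already-queued end-of-turn event run first, which is exactly the orchestration a turn-based battle sequence needs:

```python
q = GameplayQueue()
q.enqueue("attack", lambda q: q.enqueue("death_animation", lambda q: None))
q.enqueue("end_turn", lambda q: None)
q.run()
# q.log is now ["attack", "end_turn", "death_animation"]
```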
Week 2.5 update
Week 2 update
Great progress so far.
- Full Vive controller and headset support.
- Reading cards from a database.
- Dynamically loading card images and spell or minion game objects.
- Animations for spawn, idle and attack sequences.
- Beginning enemy AI.
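“Reading cards from a database” typically means parsing the card definitions once and keying them by id, so images and spell/minion objects can be looked up instantly at spawn time. A minimal sketch of that pattern; the JSON schema and field names here are assumptions, not the project’s actual format:

```python
import json

def load_cards(json_text):
    """Parse card definitions and index them by id for O(1) lookup
    when a card is drawn or cast."""
    cards = json.loads(json_text)
    return {card["id"]: card for card in cards}
```

At spawn time the game would then look up `cards["fireball"]` (or whichever id the drawn card carries) and load the matching image and prefab.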
What you are seeing in this post was created in two days. We expect much more to come in the following weeks.
As a team we met up at 6pm on a Friday night. We hashed out the game mechanics on a white board, sat down and produced this after about an hour.
In non-chronological order, this is the further progression of the game that night and part of the following day. (These are all GIFs.)
At this point the player can interact with many objects in the game, we have a very basic test battle field. The player can draw cards, pick them up, cast them and see minions spawn. Minions with charge demonstrate their attack.
As of writing this, the plan of attack for tomorrow is to wire up the more sophisticated card management system that was written over the weekend, and to integrate the Vive SDK. This should be minimal effort based on past experience, but some time will have to be invested in scaling all of the in-world game objects properly.
(these are static images)
We created a ‘photobooth’ scene to take images of our minions/spells to then be placed on the cards.
SORRY ABOUT THE ADS 😦 THAT’S WORDPRESS, NOT ME
The short story is, I created a tripod adapter for my VR headsets.
This improves my development flow; I talk about it at length in my YouTube videos below. Basically, when you’re designing levels and scaling objects in a level, you’re constantly dipping in and out of VR, and it’s a brutal workflow. Most of the time you don’t care about head tracking; you actually want to keep the headset fixed on the spot in the world you’re working on.
Using a tripod mount you can fix the view to a specific area, and you can set the headset at a comfortable hands-free position. You can see your computer monitor and VR view at the same time if you remove the shroud. It also makes it easier to move the headset smoothly in VR space, while debugging on your computer monitor, without having the view go all jittery as it does when you just hold it in your hand.
I now have a working design for the HTC Vive and I love it. I’m considering doing a manufacturing run if enough people want one. For now, if you want to order a 3D printed one, with foam padding and a tripod-compatible nut, send me a message: @daggasoft.
Today we did our first public convention, Maker Expo KW 2016, a local event taking place at city hall in Kitchener–Waterloo. There were robots, 3D printers, and LEDs everywhere. It was very cool. There were an estimated 10,000 attendees.
I was a little worried at first since I didn’t know what to expect. Would people play our games, or would they just glance and walk by? Would they play them and not like them? After all, this was our debut in the sense of getting real-time feedback in person. Uncharted territory.
All of my worries washed away when person after person sat down, began to play, and actually played for a considerable amount of time. We demonstrated two games, Super Markup World and Peripheral. Both were prize-winning games at global game jams this year: Peripheral won 1st place and Super Markup World took 4th.
So all in all, a great time. I do wish that more adults sat down to try the games. I also wish that more people followed us on twitter afterwards. We will have to figure out how to do a “call to action” better next time.
We won a hackathon for our game, and the publishers who hosted the hackathon really like the game concept. But the harsh reality is that our primitive graphics are just not good enough for prime time. In fact, they are so primitive that the OSVR fund won’t give us any money to hire an artist.
So what does one do with zero budget, no funding and no artistic talent? You experiment.
Here is what Peripheral originally looked like, compared to three very simple experiments with edge detection, colour correction, and grayscale.
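The grayscale and edge-detection passes are built-in effects in Unity, but the math behind them is simple enough to sketch. Grayscale is just a weighted sum of the colour channels (standard Rec. 601 luminance weights); the edge measure below is a crude neighbour-difference gradient, not the actual shader the game uses:

```python
def to_grayscale(pixels):
    """Convert a grid of (r, g, b) pixels to luminance values using
    the standard Rec. 601 weights (0.299, 0.587, 0.114)."""
    return [[0.299 * r + 0.587 * g + 0.114 * b for (r, g, b) in row]
            for row in pixels]

def edge_strength(gray, x, y):
    """Crude edge detection at an interior pixel: sum of horizontal
    and vertical luminance differences between its neighbours."""
    gx = gray[y][x + 1] - gray[y][x - 1]
    gy = gray[y + 1][x] - gray[y - 1][x]
    return abs(gx) + abs(gy)
```

A real post-processing effect does the same per-pixel work in a fragment shader, then darkens pixels where the edge strength is high to get the outlined look.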
This one is pretty boring
Here we are after adding some simple textures, with normal maps for a bit of depth on the lines. We like this a lot; it keeps things simple but adds some much-needed polish.
Here we are after further refinement with Bloom, Edge Detection, Color Correction, and a few others. Personally I’m a fan of the bright, loud colours. However I think the combination of that along with the bloom is washing the scene out too much.
All in all, I think it’s safe to say the graphics still need some work, but it’s pretty impressive how much polish you can add to a game with some free, built-in effects and textures, and without changing the existing models.