Aframe Systems and Networked-Aframe

This week saw less direct contribution to the project and had a greater testing and learning component. Most of my time in the past week was devoted to a different Aframe project involving multi-user interaction with both cooperative and competitive components. I used this project as a guinea pig to test Networked-Aframe and some more 'advanced' Aframe design patterns (~25 hours).

Networked-Aframe is a framework built on top of Aframe that integrates WebSockets and component syncing to streamline the process of creating multi-user Aframe projects. The main feature of Networked-Aframe we were interested in was the voice chat support, as we feel it could be a great feature to have for playing the game with friends remotely. However, I also set out to see whether other features in the framework could be useful in developing our project. One element I did find useful was component syncing, which uses HTML <template> tags to create entities whose components are synced across all clients. Only the 'position' and 'rotation' components are synced by default, but implementing custom components is quite simple, as shown below.

NAF.schemas.add({
  template: '#avatar-template',  // the <template> this schema applies to
  components: [
    'position',                  // a custom schema must list these explicitly,
    'rotation',                  // even though they are synced by default
    'scale',
    {
      selector: '.hairs',        // sync a component on a child element
      component: 'show-child'
    },
    {
      selector: '.head',
      component: 'material',
      property: 'color'          // sync only one property of a component
    },
  ]
});

The above code, taken from the Networked-Aframe GitHub repository, shows how to sync the custom 'show-child' component, as well as the 'color' property of the 'material' component. However, due to the nature of our project, where each player will be in their own environment and will not need synced entities, using Networked-Aframe just for voice chat may be overkill.
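For reference, the voice chat support mentioned above is switched on through the networked-scene component. A minimal sketch of how that configuration could look from JavaScript, assuming the EasyRTC adapter (the room name here is purely illustrative, not from our project):

```javascript
// Hypothetical sketch: configuring NAF voice chat on the scene element.
// Equivalent to setting the networked-scene attribute directly in HTML.
const sceneEl = document.querySelector('a-scene');
sceneEl.setAttribute('networked-scene', {
  room: 'build-a-furniture',  // illustrative room name
  adapter: 'easyrtc',         // an adapter that supports audio streaming
  audio: true                 // enables microphone capture / voice chat
});
```

The same settings can of course live inline on the `<a-scene>` tag; doing it from script just makes it easier to toggle voice chat conditionally.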

Finally, I also took the time to experiment with Aframe systems. Systems provide overarching management and services to the entire application, and they are often used to manage the game state or NPC elements of games. These systems rely on events to communicate with the various components in the application. My first attempt was not wholly successful, but I figured out a lot of dos and don'ts for when it comes time to implement such systems in Build-A-Furniture.
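To give a flavour of the pattern, here is a minimal sketch of an event-driven system, assuming the standard AFRAME.registerSystem API. The 'game-state' name and the 'piece-placed' event are hypothetical examples, not our actual implementation:

```javascript
// Hypothetical sketch of a system definition for a game-state manager.
// Components communicate with it by emitting events that bubble up to
// the scene element (this.el inside a system).
const gameStateSystem = {
  init: function () {
    this.piecesPlaced = 0;
    if (this.el) {
      // Listen for a (hypothetical) event emitted by furniture components.
      this.el.addEventListener('piece-placed', () => this.onPiecePlaced());
    }
  },
  onPiecePlaced: function () {
    this.piecesPlaced += 1;
  }
};

// In the browser, this would be registered once before the scene loads:
// AFRAME.registerSystem('game-state', gameStateSystem);
```

A component would then notify the system with something like `el.emit('piece-placed')`, since A-Frame events bubble up to the scene by default.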
