{"id":517,"date":"2016-09-02T08:26:42","date_gmt":"2016-09-02T08:26:42","guid":{"rendered":"http:\/\/magiteker.com\/?p=517"},"modified":"2016-10-24T22:51:19","modified_gmt":"2016-10-24T22:51:19","slug":"unity-synthesizer","status":"publish","type":"post","link":"https:\/\/magiteker.com\/index.php\/2016\/09\/02\/unity-synthesizer\/","title":{"rendered":"Unity Audio Wave Tutorial"},"content":{"rendered":"<p style=\"text-align: justify;\">Games are dynamic systems responding on different types of data inputs to create interesting results for the player. One such source of data is Audio which can at its best help the player have an emotional connection to what&#8217;s happening on screen. So with this in mind utilizing audio is many different ways can improve the level of engagement for the player. This short tutorial will explain how to utilize audio data in game to create a basic synthesizer effect giving the sound design a graphical effect for the player.<\/p>\n<h1 class=\"post-title\" style=\"text-align: center;\">Setup<\/h1>\n<p><img loading=\"lazy\" decoding=\"async\" data-attachment-id=\"537\" data-permalink=\"https:\/\/magiteker.com\/index.php\/2016\/09\/02\/unity-synthesizer\/syntheffectobject\/\" data-orig-file=\"https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SynthEffectObject.png?fit=467%2C542&amp;ssl=1\" data-orig-size=\"467,542\" data-comments-opened=\"0\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"SynthEffectObject\" data-image-description=\"\" data-image-caption=\"\" 
data-medium-file=\"https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SynthEffectObject.png?fit=441%2C512&amp;ssl=1\" data-large-file=\"https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SynthEffectObject.png?fit=467%2C542&amp;ssl=1\" class=\" wp-image-537 alignright\" src=\"https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SynthEffectObject.png?resize=353%2C409&#038;ssl=1\" alt=\"SynthEffectObject\" width=\"353\" height=\"409\" srcset=\"https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SynthEffectObject.png?resize=441%2C512&amp;ssl=1 441w, https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SynthEffectObject.png?w=467&amp;ssl=1 467w\" sizes=\"auto, (max-width: 353px) 100vw, 353px\" data-recalc-dims=\"1\" \/><\/p>\n<p style=\"text-align: justify;\">The first step is to create an empty game object in a unity scene. On this game object add a line renderer component, setting the options as follows, turn cast shadows off, receive shadows off, motion vectors off, materials set to 1, in parameters set the start width and end width to 0.2, and finally turn world space off. These\u00a0settings will allows our game object to graphically render our audio data.<\/p>\n<p style=\"text-align: justify;\">Next a material must be created and assigned to the line\u00a0renderer, Unity recommends utilizing a particle shader for line renderers, as such I use the Particles\/Additive shader in my line material. 
Once the material is created, assign it to the game object&#8217;s line renderer Materials slot; it will appear on the object once assigned to the component.<\/p>\n<p style=\"text-align: justify;\">Last, make sure to add an Audio Source to the\u00a0game\u00a0object,\u00a0or to another object in the scene; this will allow us to play audio files which can then be sampled in our script.<\/p>\n<p style=\"text-align: justify;\">This completes the bare bones of our game object. Next is the creation of the script which will drive the object in the scene; at this point there isn&#8217;t much appearing in the scene, but that will all change once the script is implemented.<\/p>\n<h1 class=\"post-title\" style=\"text-align: center;\">Script<\/h1>\n<p style=\"text-align: justify;\">For code cleanliness I&#8217;ve split the script tasks into two separate components: one which collects the audio data and another which manipulates the line renderer&#8217;s vertex positions. Making these two separate components allows the audio data to drive different effects as desired and helps avoid redundant operations within our game.<\/p>\n<h2 style=\"text-align: center;\"><strong><span style=\"color: #00ccff;\">Audio Sampler<\/span><\/strong><\/h2>\n<p style=\"text-align: justify;\">The Audio Sampler script will contain the audio data sampled during a single frame of the game. To accomplish this the script requires an array of float values to store the audio data;\u00a0<strong>this array&#8217;s length must be a power of two (e.g. 
2, 4, 8, 16, 32, etc.) since it stores frequency-domain values sampled during a frame.\u00a0<\/strong>We can use the MonoBehaviour OnValidate method to ensure our array length value is a power of two with the boolean bitwise operation:<\/p>\n<p style=\"text-align: center;\"><strong>(x &gt; 0 &amp;&amp; (x &amp; (x &#8211; 1)) == 0)<\/strong><\/p>\n<p style=\"text-align: justify;\">This statement uses a bitwise AND between an int x and x &#8211; 1; a power of two has exactly one bit set, so the result of the AND is zero only for the desired powers of two. The first check on x is only to ensure x is not negative or zero, which would give us erroneous results. Finally, above the int\u00a0field we&#8217;ll place the Unity Range attribute\u00a0<strong>[Range(2, 1024)]\u00a0<\/strong>to further ensure our field cannot be set to any values that might cause problems within the Unity editor.<\/p>\n<p style=\"text-align: justify;\">The next field to create is an array to hold our sampled audio data. This array will be sized using the previously mentioned int array length value, and it will be initialized in the MonoBehaviour <strong>Start<\/strong> method to ensure it&#8217;s ready to store data on the first Update call.<\/p>\n<p style=\"text-align: justify;\">The script will also require a Fourier transform window, which is used to reduce spectral leakage, where energy from one frequency smears into neighboring frequency bins of the sampled data. Unity provides a data type <a href=\"http:\/\/docs.unity3d.com\/ScriptReference\/FFTWindow.html\">FFTWindow<\/a>\u00a0which provides different types of windows selectable via an enumerated value, the most accurate being Blackman, but feel free to play with the different types based on the effect you want your data to drive.<\/p>\n<p style=\"text-align: justify;\">The <strong>Update<\/strong> method, which triggers every frame, need only contain the code to populate our initialized float array. 
To accomplish this we call the static method GetSpectrumData in the <a href=\"https:\/\/docs.unity3d.com\/ScriptReference\/AudioListener.html\">AudioListener<\/a> class with parameters (float[], int, FFTWindow). This method will take our array and fill it with spectrum data, based on the audio channel int value and the Fourier transform window provided, then return the array to our AudioSampler.<\/p>\n<p style=\"text-align: justify;\">Finally we&#8217;ll create a public method <strong>GetAudioData<\/strong> to access the float array data given an int index parameter. Take care to guard against index values larger than the length of your array or smaller than zero to prevent index out of bounds exceptions, then return the float value referenced by the validated index.<\/p>\n<p style=\"text-align: justify;\">The complete code for this component is provided below.<\/p>\n<p style=\"text-align: center;\">\n<pre class=\"brush: csharp; title: ; notranslate\" title=\"\">\r\nusing UnityEngine;\r\nusing UnityEngine.Assertions;\r\n\r\npublic class AudioSampler : MonoBehaviour\r\n{\r\n \r\n    public static AudioSampler Instance;\r\n\r\n    &#x5B;Range(2, 1024)]\r\n    int arrayLength = 1024;\r\n    public int ArrayLength { get { return arrayLength; } }\r\n\r\n    float&#x5B;] samples;\r\n\r\n    &#x5B;SerializeField]\r\n    FFTWindow fourierTransform = FFTWindow.Triangle;\r\n\r\n    internal bool IsPowerOfTwo(int x)\r\n    {\r\n        return (x &gt; 0 &amp;&amp; (x &amp; (x - 1)) == 0);\r\n    }\r\n\r\n    void OnValidate()\r\n    {\r\n        while (!IsPowerOfTwo(arrayLength))\r\n        {\r\n            arrayLength++;\r\n        }\r\n    }\r\n\r\n    void Awake()\r\n    {\r\n        if (Instance != null &amp;&amp; Instance != this)\r\n        {\r\n            Destroy(gameObject);\r\n            return;\r\n        }\r\n        Instance = this;\r\n    }\r\n\r\n    void Start()\r\n    {\r\n        samples = new float&#x5B;arrayLength];\r\n    }\r\n\r\n    void Update()\r\n    {\r\n        if (samples == null)\r\n            
return;\r\n\r\n        AudioListener.GetSpectrumData(\r\n            samples,\r\n            0,\r\n            fourierTransform\r\n            );\r\n    }\r\n\r\n    public float GetAudioData(int index)\r\n    {\r\n        Assert.IsTrue((index &lt; arrayLength &amp;&amp; index &gt;= 0));\r\n        return samples&#x5B;index];\r\n    }\r\n}\r\n<\/pre>\n<\/p>\n<h2 style=\"text-align: center;\"><strong><span style=\"color: #00ccff;\">Audio Wave\u00a0Effect<\/span><\/strong><\/h2>\n<p style=\"text-align: justify;\">This component will manipulate the line renderer component on the game object created during the setup section. To accomplish this, a reference to that component must first be cached in a field in our script (the sample code does this in the <strong>Awake<\/strong> method). Caching this reference in a field is good practice since the\u00a0<strong>GetComponent<\/strong> method is a fairly expensive operation, so it&#8217;s best to call it as rarely as possible.<\/p>\n<p style=\"text-align: justify;\">After setting a reference to the line renderer, a reference to our\u00a0<strong>AudioSampler<\/strong> script should be established. To get this reference you only need to utilize the singleton design pattern implemented in that class through the\u00a0<strong>Instance\u00a0<\/strong>public static field. Again, store this reference in a field for later use.<\/p>\n<p style=\"text-align: justify;\">Next we need to set the number of vertices for our line. This should match the number of samples in the samples array in the\u00a0<strong>AudioSampler<\/strong>\u00a0script so we can match each vertex with a given sample value. Obtain the\u00a0<strong>arrayLength<\/strong> value from the\u00a0<strong>AudioSampler<\/strong> and use it to create an array of\u00a0Vector3. After the Vector3 array is declared and initialized with the correct length, the vertex positions must be set up. 
I suggest creating two public Vector3 fields to store the start and end positions of your line, with the two points offset along a single axis. You could define these points anywhere in space; however, for simplicity&#8217;s sake I am going to set mine along the X-axis only. Using these Vector3s we can calculate the spacing our vertices need given the number of samples like so:<\/p>\n<p style=\"text-align: center;\"><strong>float distance = (lStart &#8211; lEnd).magnitude \/ arrayLength;<\/strong><\/p>\n<p style=\"text-align: justify;\">Then to find the next point in space for a vertex we create a vector from the starting Vector3 to the ending Vector3, normalize it, then scale it using our distance float. The resulting vector allows us to step from one vertex to the next in a simple for loop, remembering to store each position in our Vector3 array before adding our offset vector. After the loop terminates we need only record our line&#8217;s end point in the Vector3 array at the last index, then pass the entire array to the line renderer component like so:<\/p>\n<p style=\"text-align: center;\"><strong>lRender.SetPositions(vArray);<\/strong><\/p>\n<p style=\"text-align: justify;\">Now our line is set up in script and we have an array containing each vertex, so the last task is to code the\u00a0<strong>Update\u00a0<\/strong>method to manipulate the line based on the audio data. 
Again, for simplicity&#8217;s sake I&#8217;m restricting all the vertex manipulation to the Y-axis in this example, so that the line will only bend upward.\u00a0First I declare a Vector3, which will represent the offset for a line vertex, and initialize it to <strong>Vector3.zero<\/strong>.\u00a0Next I declare a for loop that steps through our whole sample data array; within the body of the loop we set our offset Vector3&#8217;s Y-component to the value of the sample at that index, after which we pass the new vertex position to our line renderer like so:<\/p>\n<p style=\"text-align: center;\"><strong>lRender.SetPosition(index, vArray[index] + offset);<\/strong><\/p>\n<p style=\"text-align: justify;\">Finally we can close the for loop and finish the\u00a0<strong>Update\u00a0<\/strong>method. At this point you should be able to plug a sound file into your Audio Source within the Unity editor and observe the line moving. If the line&#8217;s motion isn&#8217;t noticeable enough, I would recommend either upping the volume on your Audio Source or creating a float multiplier for the offset, as can be seen in the <strong>gain<\/strong> field within the sample code below.<\/p>\n<pre class=\"brush: csharp; title: ; notranslate\" title=\"\">\r\nusing UnityEngine;\r\n\r\npublic class AudioWaveEffect : MonoBehaviour \r\n{\r\n    LineRenderer lRender;\r\n    AudioSampler aSampler;\r\n\r\n    &#x5B;SerializeField]\r\n    Vector3 lStart, lEnd;\r\n    Vector3&#x5B;] vArray;\r\n\r\n    &#x5B;SerializeField]\r\n    &#x5B;Range(-100f, 100f)]\r\n    float gain = 1f;\r\n\r\n    int length = 0;\r\n\r\n    void Awake()\r\n    {\r\n        lRender = GetComponent&lt;LineRenderer&gt;();\r\n    }\r\n\r\n    void Start()\r\n    {\r\n        aSampler = AudioSampler.Instance;\r\n        if (aSampler == null || lRender == null)\r\n        {\r\n            Destroy(gameObject);\r\n            return;\r\n        }\r\n\r\n        length = aSampler.ArrayLength;\r\n        
LineSetup();\r\n    }\r\n\r\n    internal void LineSetup()\r\n    {\r\n        float distance = (lStart - lEnd).magnitude \/ length;\r\n       \r\n        Vector3 lVertex = lStart;\r\n        Vector3 offset = (lStart - lEnd).normalized * distance;\r\n        \r\n        Vector3&#x5B;] lVerts = new Vector3&#x5B;length];\r\n        lRender.SetVertexCount(length);\r\n\r\n        for (int i = 0; i &lt; length - 1; i++)\r\n        {\r\n            lVerts&#x5B;i] = lVertex;\r\n            lVertex -= offset;\r\n        }\r\n\r\n        lVerts&#x5B;length - 1] = lEnd;\r\n        lRender.SetPositions(lVerts);\r\n        vArray = lVerts;\r\n\r\n    }\r\n\r\n    void Update()\r\n    {\r\n        Vector3 audio = Vector3.zero;\r\n        for (int i = 0; i &lt; length; ++i)\r\n        {\r\n            audio.y = aSampler.GetAudioData(i) * gain;\r\n            lRender.SetPosition(i, vArray&#x5B;i] + audio);\r\n\r\n        }\r\n    }\r\n\r\n}\r\n\r\n<\/pre>\n<\/p>\n<p><!--nextpage--><\/p>\n<h2 style=\"text-align: center;\"><span style=\"color: #33cccc;\"><strong>Audio Sound Wave<\/strong><\/span><\/h2>\n<p><img loading=\"lazy\" decoding=\"async\" data-attachment-id=\"546\" data-permalink=\"https:\/\/magiteker.com\/index.php\/2016\/09\/02\/unity-synthesizer\/soundwave\/\" data-orig-file=\"https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SoundWave.png?fit=1158%2C416&amp;ssl=1\" data-orig-size=\"1158,416\" data-comments-opened=\"1\" data-image-meta=\"{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}\" data-image-title=\"SoundWave\" data-image-description=\"\" data-image-caption=\"\" 
data-medium-file=\"https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SoundWave.png?fit=512%2C184&amp;ssl=1\" data-large-file=\"https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SoundWave.png?fit=1024%2C368&amp;ssl=1\" class=\"aligncenter size-medium wp-image-546\" src=\"https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SoundWave.png?resize=512%2C184&#038;ssl=1\" alt=\"SoundWave\" width=\"512\" height=\"184\" srcset=\"https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SoundWave.png?resize=512%2C184&amp;ssl=1 512w, https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SoundWave.png?resize=768%2C276&amp;ssl=1 768w, https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SoundWave.png?resize=1024%2C368&amp;ssl=1 1024w, https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SoundWave.png?w=1158&amp;ssl=1 1158w\" sizes=\"auto, (max-width: 512px) 100vw, 512px\" data-recalc-dims=\"1\" \/><\/p>\n<p style=\"text-align: justify;\">Now that we have the simple data affecting the line renderer it&#8217;s time to make it resemble something closer to what a sound wave should look like. In order to achieve this it&#8217;s necessary to recognize what the data Unity is providing us from the\u00a0<strong>AudioListener<\/strong>.<\/p>\n<p style=\"text-align: justify;\">If we examine the raw data from a sample it appears to be very small values between 0 and 1 which tells us that this is the angular frequency of the sound at that specific sample. In basic\u00a0<strong>Audio Wave Effect<\/strong> their were only lines being pushed up based on this data, what is actually being represented is a series of sine waves being viewed from their narrowest vantage meaning stacked together.<\/p>\n<p style=\"text-align: justify;\">In order to decouple these sine waves we need to utilize the audio data in a new equation to form each individual wave. 
As such we begin by allocating an array of Vector3 with one entry for each point on the sound wave. Next we step through the audio data for the frame, once for each sine wave we wish to create (numWaves in the sample code, up to the number of samples our\u00a0<strong>AudioSampler<\/strong> recorded). For each sample we loop over our array of Vector3 representing our final sound wave, adding in the current sample&#8217;s sine wave using this formula:<\/p>\n<p style=\"text-align: center;\"><strong>fWave[j] += Vector3.up * gain * sample * Mathf.Sin((gain * sample * i * length * 2 * Mathf.PI) + dist);<\/strong><\/p>\n<p style=\"text-align: justify;\">Breaking apart this formula we have several operations going on. First we take the world up vector to ensure our result is a vector. Next we have the\u00a0<strong>amplitude<\/strong>, represented as the\u00a0<strong>gain<\/strong>\u00a0multiplied by\u00a0the\u00a0<strong>sample<\/strong>. Finally we have the actual sine function, with the angle calculated as the\u00a0<strong>gain\u00a0<\/strong>times the\u00a0<strong>sample\u00a0<\/strong>times the wave index <strong>i<\/strong> (acting as the\u00a0<strong>frequency<\/strong>)\u00a0times the\u00a0<strong>length\u00a0<\/strong>times 2 PI; then we apply what is known as a phase shift using the distance this sample&#8217;s vertex is from the start of our line.<\/p>\n<p style=\"text-align: justify;\">Despite the bulky math this is a reasonable physical approximation of an audio wave using the Unity <strong>Line Renderer<\/strong>. 
The final task is similar to the previous effect, in which we just pass the vertex data to our line renderer for that frame to be drawn to the screen.<\/p>\n<p style=\"text-align: center;\">\n<pre class=\"brush: csharp; title: ; notranslate\" title=\"\">\r\nusing UnityEngine;\r\n\r\npublic class SynthesizerEffect : MonoBehaviour\r\n{\r\n    LineRenderer lRender;\r\n    AudioSampler aSampler;\r\n\r\n    &#x5B;SerializeField]\r\n    Vector3 lStart, lEnd;\r\n    Vector3&#x5B;] vArray;\r\n\r\n    &#x5B;SerializeField]\r\n    &#x5B;Range(1f, 10000f)]\r\n    float gain = 1f;\r\n    &#x5B;SerializeField]\r\n    &#x5B;Range(1, 64)]\r\n    int numWaves = 1;\r\n\r\n    int length = 0;\r\n\r\n    void Awake()\r\n    {\r\n        lRender = GetComponent&lt;LineRenderer&gt;();\r\n    }\r\n\r\n    void Start()\r\n    {\r\n        aSampler = AudioSampler.Instance;\r\n        if (aSampler == null || lRender == null)\r\n        {\r\n            Destroy(gameObject);\r\n            return;\r\n        }\r\n\r\n        length = aSampler.ArrayLength;\r\n        LineSetup();\r\n    }\r\n\r\n    internal void LineSetup()\r\n    {\r\n        float distance = (lStart - lEnd).magnitude \/ length;\r\n       \r\n        Vector3 lVertex = lStart;\r\n        Vector3 offset = (lStart - lEnd).normalized * distance;\r\n        \r\n        Vector3&#x5B;] lVerts = new Vector3&#x5B;length];\r\n        lRender.SetVertexCount(length);\r\n\r\n        for (int i = 0; i &lt; length - 1; i++)\r\n        {\r\n            lVerts&#x5B;i] = lVertex;\r\n            lVertex -= offset;\r\n        }\r\n\r\n        lVerts&#x5B;length - 1] = lEnd;\r\n        lRender.SetPositions(lVerts);\r\n        vArray = lVerts;\r\n\r\n    }\r\n\r\n    void Update()\r\n    {\r\n        Vector3&#x5B;] fWave = new Vector3&#x5B;length];\r\n\r\n        for (int i = 0; i &lt; numWaves; ++i)\r\n        {\r\n\r\n            float freq = aSampler.GetAudioData(i);\r\n            for (int j = 0; j &lt; length; ++j)\r\n            {\r\n                var dist = 
(vArray&#x5B;0] - vArray&#x5B;j]).magnitude;\r\n                fWave&#x5B;j] += Vector3.up * gain * freq *\r\n                    Mathf.Sin(\r\n                        (gain * freq * i * length * 2 * Mathf.PI) + dist\r\n                        );\r\n            }\r\n        }\r\n\r\n        for(int i=0; i &lt; length; ++i)\r\n        {\r\n            lRender.SetPosition(\r\n                i,\r\n                vArray&#x5B;i] + fWave&#x5B;i] \/ length\r\n                );\r\n        }\r\n\r\n\r\n\r\n    }\r\n\r\n}\r\n\r\n<\/pre><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Games are dynamic systems responding on different types of data inputs to create interesting results for the player. One such source of data is Audio which can at its best help the player have an emotional connection to what&#8217;s happening on screen. So with this in mind utilizing audio is many different ways can improve [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":539,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":true,"_jetpack_newsletter_tier_id":0,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":true,"jetpack_social_options":{"image_generator_settings":{"template":"highway","enabled":false}}},"categories":[21,3,2,5],"tags":[],"class_list":["post-517","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-code","category-design_patterns","category-projects","category-tutorials"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"https:\/\/i0.wp.com\/magiteker.com\/wp-content\/uploads\/2016\/09\/SynthEffect.png?fit=1154%2C418&ssl=1","jetpack_sharing_enabled":true,"jetpack_shortlink":"https:\/\/wp.me\/p6MIgq-8l","jetpack-related-posts":[],"_links":{"self":[{"href":"https:\/\/mag
iteker.com\/index.php\/wp-json\/wp\/v2\/posts\/517","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/magiteker.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/magiteker.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/magiteker.com\/index.php\/wp-json\/wp\/v2\/users\/2"}],"replies":[{"embeddable":true,"href":"https:\/\/magiteker.com\/index.php\/wp-json\/wp\/v2\/comments?post=517"}],"version-history":[{"count":31,"href":"https:\/\/magiteker.com\/index.php\/wp-json\/wp\/v2\/posts\/517\/revisions"}],"predecessor-version":[{"id":584,"href":"https:\/\/magiteker.com\/index.php\/wp-json\/wp\/v2\/posts\/517\/revisions\/584"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/magiteker.com\/index.php\/wp-json\/wp\/v2\/media\/539"}],"wp:attachment":[{"href":"https:\/\/magiteker.com\/index.php\/wp-json\/wp\/v2\/media?parent=517"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/magiteker.com\/index.php\/wp-json\/wp\/v2\/categories?post=517"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/magiteker.com\/index.php\/wp-json\/wp\/v2\/tags?post=517"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}