
Relative velocity and ground velocity with the unified rectangular coordinate system for a physics pixel model

The unified rectangular coordinate system can be a moving ego/local coordinate system for a physics pixel model, in which the origin is the observer/camera/device of the physics pixel model.

It is better to include both relative velocity and ground velocity along with the unified rectangular coordinate system for a physics pixel model, and also to include the velocity and rotation of the origin of the unified rectangular coordinate system relative to the stationary ground.

The relative velocity vector of an object/point is relative to the origin of the unified rectangular coordinate system.

The ground velocity vector of an object/point/origin is relative to the stationary ground.

The ground rotation vector of an object/point/origin is relative to the stationary ground.
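To make the relation between the three explicit: the ground velocity of a point equals the ground velocity of the origin, plus the contribution of the origin’s rotation, plus the point’s relative velocity. Below is a minimal sketch in Python (the function and variable names are my own, not part of the model), assuming the ground rotation vector is an angular velocity and all vectors are expressed in the same axes.

import numpy as np

def ground_velocity_of_point(v_rel, p_rel, v_origin_ground, w_origin_ground):
    """Ground velocity of a point, given its motion relative to the moving origin.

    v_rel           -- relative velocity vector of the point (relative to the origin)
    p_rel           -- position of the point in the unified rectangular coordinate system
    v_origin_ground -- ground velocity vector of the origin (a shared frame parameter)
    w_origin_ground -- ground rotation (angular velocity) vector of the origin
    All vectors are 3-element arrays expressed in the same axes.
    """
    return v_origin_ground + np.cross(w_origin_ground, p_rel) + v_rel

# Example: origin moving at 10 m/s along x, not rotating, point drifting back at 1 m/s.
print(ground_velocity_of_point(np.array([-1.0, 0.0, 0.0]),
                               np.array([5.0, 0.0, 0.0]),
                               np.array([10.0, 0.0, 0.0]),
                               np.zeros(3)))   # -> [9. 0. 0.]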

Below is a summary of the physics pixel model, including reflection and relative/ground velocities.

******

******

I thought it necessary to identify reflections in mirrors better, because this is essential not only for video generative applications but also for control scenarios, such as mirrors set at corners for cars, or a robot smashing into a mirror.

In fact, all visible physics pixels except self-luminous ones have a reflection rate above 0, since they must reflect light to be visible; in this sense, the RGBA of every reflecting physics pixel may include contributions from the RGBA of other physics pixels it reflects. So it is important to handle the reflection effect.
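As a rough illustration only (the post does not fix any mixing formula), a first-order way to model this is to blend a physics pixel’s own color with the color it reflects, weighted by its reflection rate:

def mix_reflection(own_rgb, reflected_rgb, reflection_rate):
    """Illustrative first-order mixing: the visible color of a reflecting physics pixel
    is its own color plus a contribution from the physics pixel it reflects,
    weighted by a reflection rate r in [0, 1]. This is an assumption, not the model's rule."""
    r = max(0.0, min(1.0, reflection_rate))
    return tuple((1.0 - r) * o + r * m for o, m in zip(own_rgb, reflected_rgb))

# Example: a weakly reflective wall (r = 0.2) picking up a red reflection.
print(mix_reflection((0.5, 0.5, 0.5), (1.0, 0.0, 0.0), 0.2))   # (0.6, 0.4, 0.4)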

A virtual image physics pixel in a mirror is a reflection image lying on the straight ray view of its corresponding pixel, extended into the mirror surface;
it is reflected from a reflected physics pixel, which is on the opposite side of the mirror and lies on the reflected ray view of the corresponding pixel;
and it is reflected by a reflecting physics pixel, which is on the mirror/reflecting surface and lies on the ray view of the corresponding pixel.

So the key is how to represent the virtual “image physics pixel” in the mirror, the real “reflected physics pixel” on the opposite side of the mirror, and the real “reflecting physics pixel” on the surface of the mirror.

Previously I introduced a prefix “TNC” for the coordinates XYZ and X’Y’Z’, which can be used to tag physics pixels related to the mirror effect. So set the highest 3 bits of the “C” of “TNC” as the identifier for the mirror effect, in which C = mirror bits (3 bits) + coordinate system type (5 bits), and the mirror bits are:
the highest bit represents whether it is an image physics pixel: 0 means no and 1 means yes,
the second highest bit represents whether it is a reflected physics pixel: 0 means no and 1 means yes,
the third highest bit represents whether it is a reflecting physics pixel: 0 means no and 1 means yes.

As for the coordinate system type (5 bits), coordinate type “00001” represents the sensor’s own perspective coordinate system, and coordinate type “00010” represents the unified rectangular coordinate system.
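A minimal sketch of how the C field could be packed and unpacked, assuming C is one byte with the 3 mirror bits in the high positions and the 5-bit coordinate type in the low positions; the helper names are hypothetical, only the bit layout comes from the text above.

# Mirror bits (highest 3 bits of C) and coordinate types (lowest 5 bits).
IMAGE_BIT      = 0b100    # image physics pixel
REFLECTED_BIT  = 0b010    # reflected physics pixel
REFLECTING_BIT = 0b001    # reflecting physics pixel

COORD_PERSPECTIVE = 0b00001   # sensor's own perspective coordinate system
COORD_UNIFIED     = 0b00010   # unified rectangular coordinate system

def pack_c(mirror_bits, coord_type):
    """Pack the 3 mirror bits and the 5-bit coordinate type into one byte."""
    return ((mirror_bits & 0b111) << 5) | (coord_type & 0b11111)

def unpack_c(c):
    """Return (mirror_bits, coordinate_type) from a packed C byte."""
    return (c >> 5) & 0b111, c & 0b11111

# "100-00001": an image physics pixel tagged in the camera's perspective system.
assert pack_c(IMAGE_BIT, COORD_PERSPECTIVE) == 0b10000001
# "110-00010": an image + reflected physics pixel in the unified rectangular system.
assert pack_c(IMAGE_BIT | REFLECTED_BIT, COORD_UNIFIED) == 0b11000010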

A pair of an image physics pixel and its reflected physics pixel is represented by one physics pixel.

Also, make each physics pixel include 2 sets of RGBA, RGBA_1 and RGBA_2:
for a real physics pixel (not an image/reflected physics pixel), RGBA_1 = RGBA_2;
for a physics pixel representing an image/reflected physics pixel pair, RGBA_1 is of the image physics pixel and RGBA_2 is of the reflected physics pixel, and all parameters of this physics pixel, including X’Y’Z’, are of the reflected physics pixel, except RGBA_1 and XYZ, which are of the image physics pixel.
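To illustrate the convention, here is a sketch of one record standing in for an image/reflected pair; the field names are hypothetical, but the split follows the rule above: XYZ and RGBA_1 describe the virtual image, while X’Y’Z’, RGBA_2 and all remaining parameters describe the real reflected physics pixel.

# One physics pixel record representing an image/reflected pair (illustrative field names).
pair_record = {
    "C_xyz":    "100-00001",          # mirror bits + coordinate type for the XYZ part
    "XYZ":      (412, 288, 3.2),      # virtual position of the image physics pixel (camera view)
    "RGBA_1":   (200, 180, 160, 255), # color of the image physics pixel
    "C_xpypzp": "010-00010",          # mirror bits + coordinate type for the X'Y'Z' part
    "XpYpZp":   (1.8, 0.4, 2.6),      # real position of the reflected physics pixel (unified system)
    "RGBA_2":   (205, 182, 158, 255), # color of the reflected physics pixel
    # ... all other parameters (interface, objects, points, surfaces) belong to the
    # reflected physics pixel.
}

# A real physics pixel simply carries two identical RGBA sets.
real_record = {"RGBA_1": (90, 90, 90, 255), "RGBA_2": (90, 90, 90, 255)}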

Following are the related parameters of the 3 physics pixels involved in an “image physics pixel”, from which an AI model can learn and understand the physical nature of the image physics pixel in labeled training.

1>> A pair of an image physics pixel and its corresponding reflected physics pixel includes one set of parameters for one physics pixel,

wherein XYZ are the coordinates of the virtual position of the image physics pixel in the camera’s ray view or perspective coordinate system, and RGBA_1 is of the image physics pixel too,

wherein all other parameters except XYZ and RGBA_1 are of the corresponding reflected physics pixel,

2>> the parameters of the image physics pixel

2.1> TNC+XYZ, where XYZ are the perspective coordinates of the virtual position of the image physics pixel in the camera’s view; the image physics pixel doesn’t have its own X’Y’Z’ mapped from its XYZ, and the XYZ of the image physics pixel corresponds to the X’Y’Z’ of the reflected physics pixel,

wherein C=“100-00001”, C = mirror bits + coordinate type, mirror bits “100” represent an image physics pixel, and coordinate type “00001” represents the camera’s perspective coordinate system,

8 bits: T = sensor type (“1” means camera, “2” means mm-wave radar, “3” means microphone),

8 bits: N = sensor ID (serial number of the sensor within its own type),

2.2> RGBA_1 for the image physics pixel, wherein the model can learn the correlation between RGBA_1 and RGBA_2 in labeled training.

3>> the parameters of the reflected physics pixel

3.1> The X’Y’Z’ of the reflected physics pixel corresponds to the XYZ of the image physics pixel, and the reflected physics pixel doesn’t have its own XYZ mapping to its X’Y’Z’,

In the C+X’Y’Z’ of the reflected physics pixel, C=“010-00010”,

the direction vector of X’Y’Z’ of the reflected physics pixel is from the near/first object to the far/second object along the reflected ray view of the corresponding pixel,

the direction vector of X’Y’Z’ of a physics pixel is perpendicular to the interface of the physics pixel at the point of the physics pixel.

3.2> RGBA_2 for the reflected physics pixel, which can be labeled in labeled training to let the model learn the correlation between RGBA_2 and RGBA_1 of the image physics pixel.

4>> the parameters of the reflecting physics pixel

4.1> the reflection rate parameter,

if the reflection rate parameter of a physics pixel on a surface is above 0, then the RGBA of this physics pixel includes the RGBA of one or more other reflected physics pixels, through one or more image physics pixels on the extended ray view of its pixel, deep into the reflecting surface;

not all physics pixels with a non-zero reflection rate need to be reflecting physics pixels, and some of them can ignore the reflection effect when it doesn’t matter;

for a reflecting physics pixel, there must be at least one corresponding pair of an image physics pixel and a reflected physics pixel;

there could be a physics pixel with a reflection rate above 0 that is not classified as a reflecting pixel, which means the physics pixel has a reflecting effect but has no corresponding image/reflected physics pixel pair, because the image is invisible or doesn’t matter.

4.2> In the TNC+XYZ of the reflecting physics pixel, C = mirror bits + coordinate type = “001-00001”,

4.3> X’Y’Z’ and the direction vector of X’Y’Z’

X’Y’Z’ and the direction vector of X’Y’Z’ can let the model learn the direction of the incident light reflected by the reflecting pixel,

in the C+X’Y’Z’ of the reflecting physics pixel, C=“001-00010”,

4.4> The RGBA of the reflecting physics pixel shall be like that of a transparent physics pixel, through which the camera can see the image physics pixel,

the reflecting physics pixel is a real physics pixel and not an image physics pixel, so RGBA_1 and RGBA_2 of the reflecting physics pixel are identical.

5>> multiple reflection

5.1> In a multiple reflection case, at least the first reflected pixel is also a virtual image physics pixel, so in its C+X’Y’Z’, C is “110-00010”, and X’Y’Z’ are the rectangular coordinates of its virtual position. All other parameters except XYZ and X’Y’Z’ shall be for the final real physics pixel.

5.2> For a single bounce/reflection case, the image/reflected physics pixel pair can be represented in one set of physics pixel parameters, with RGBA_1 and RGBA_2 respectively;

for a multiple bounce/reflection case, there are more than 2 image/reflected physics pixels, so they need to be represented in more than one set of physics pixel parameters.

********

********

The summary of the physics pixel model

The physics pixel establishes a true first-person perspective physics world model for a camera (or similar sensor), in which the whole space, from the camera out to deep depth, is filled with objects with interfaces in between, and a pixel’s ray view passes through these interfaces with a physics pixel at each intersection; even a point corresponding to bare sky can be regarded as located on an interface, at a certain height, between an object of the outdoor space and another object of outer space.

Each pixel in a visual frame from a camera is mapped to one or more physics pixels along its ray of view at one or more different depths.

Each physics pixel is located on an interface between (two surfaces of) two objects, wherein the physics pixel is also located on 2 points on the (two surfaces of the) two objects, wherein the physics pixel corresponds to:
an interface,
two objects (the near/first object nearer to the camera and the far/second object farther from the camera),
two points (the near point on the near surface of the near object and the far point on the far surface of the far object),
two surfaces (the near surface of the near object and the far surface of the far object).

Therefore, each physics pixel may include parameters:
for the physics pixel (XYZ, mapped rectangular X’Y’Z’ with direction vector, RGBA_1, RGBA_2),
for the interface (interface ID, refractive index, overlap number),
for the two objects (object ID, object class, parent object, velocity, rotation, mass, etc.),
for the two points (velocity, rotation, pressure, temperature, material, etc.),
for the two surfaces (surface ID, text on surface, reflection rate, material, etc.).
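A minimal sketch of how these parameter groups might be laid out in code; the class and field names are my own and not fixed by the model.

from dataclasses import dataclass, field

@dataclass
class PhysicsPixel:
    # for the physics pixel itself
    tnc_xyz: tuple        # TNC + perspective coordinates (X, Y, Z)
    c_xpypzp: tuple       # C + unified rectangular coordinates (X', Y', Z')
    direction: tuple      # direction vector of X'Y'Z', from near object to far object
    rgba_1: tuple         # image side (equal to rgba_2 for a real physics pixel)
    rgba_2: tuple         # reflected side (equal to rgba_1 for a real physics pixel)
    # for the interface
    interface_id: int = 0
    refractive_index: float = 1.0
    overlap_number: int = 0
    # for the two objects (near/first and far/second)
    near_object: dict = field(default_factory=dict)   # object ID, class, parent, velocity, rotation, mass, ...
    far_object: dict = field(default_factory=dict)
    # for the two points
    near_point: dict = field(default_factory=dict)    # velocity, rotation, pressure, temperature, material, ...
    far_point: dict = field(default_factory=dict)
    # for the two surfaces
    near_surface: dict = field(default_factory=dict)  # surface ID, text on surface, reflection rate, material, ...
    far_surface: dict = field(default_factory=dict)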

In labeled training, transparent physics pixels and invisible physics pixels can also be labeled, to let the model learn the correlation between the invisible part and the visible part, and to establish a spatial understanding of the whole space, including the invisible part.

This approach is particularly well-suited for artificial intelligence applications, enabling more accurate modeling of real-world physical interactions and spatial relationships at the pixel level.

Below is the classical example: a camera, through the window of a house, sees a wall inside the house, and this forms 4 overlapped interfaces, in which: the first interface is between the camera (near object ID: 111) and the outdoor space (far object ID: 000), the second interface is between the outdoor space (000) and the glass of the window (123), the third interface is between the glass of the window (123) and the interior space of the house (456), and the fourth interface is between the interior space of the house (456) and the wall in the house (789).
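Written out as data, the same example looks like this (the object IDs come from the paragraph above; the list structure itself is only an illustration):

# The camera-window-wall example: four overlapped interfaces along one pixel's ray view.
interfaces = [
    {"interface": 1, "near_object": {"id": "111", "class": "camera"},
                     "far_object":  {"id": "000", "class": "outdoor space"}},
    {"interface": 2, "near_object": {"id": "000", "class": "outdoor space"},
                     "far_object":  {"id": "123", "class": "glass of window"}},
    {"interface": 3, "near_object": {"id": "123", "class": "glass of window"},
                     "far_object":  {"id": "456", "class": "interior space of house"}},
    {"interface": 4, "near_object": {"id": "456", "class": "interior space of house"},
                     "far_object":  {"id": "789", "class": "wall in house"}},
]
# One physics pixel sits on each of these four interfaces along the pixel's ray view.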

———-

Below are the shared parameters of the physics pixels in a visual frame.

———-

ground velocity vector of the origin of the unified rectangular coordinate system;

ground rotation vector of the origin of the unified rectangular coordinate system.

———-

The following physics pixels lie along the ray view of a pixel in the visual frame of the camera.

———-

0. parameters of the pixel

RGB of a pixel in a visual frame;

position of the pixel in the visual frame.

———-

1. first physics pixel on 1st interface

1.1) parameters for the physics pixel

TNC + coordinates (X, Y, Z) in the perspective coordinate system of the camera, where XY correspond to the pixel’s position in the visual frame and Z corresponds to the distance/depth from the camera to the physics pixel; XYZ may use different metric units, e.g. XY in number of pixels and Z in mm or micrometres;

RGBA_1;

// RGBA_1 is for the physics pixel of XYZ, which may be an image physics pixel //

C + coordinates (X’, Y’, Z’), the mapping of (X, Y, Z) into the unified rectangular coordinate system, where all of X’Y’Z’ could be in mm or micrometres;

// for a real physics pixel, the XYZ and X’Y’Z’ above map to each other (a mapping sketch follows at the end of this parameter list) //

// for a pair of an image physics pixel and its corresponding reflected physics pixel, the XYZ above are the virtual coordinates of the image physics pixel in the camera’s perspective view, the X’Y’Z’ above are the real coordinates of the reflected physics pixel in the unified rectangular coordinate system, and the two of the pair share all other parameters except XYZ and X’Y’Z’ //

direction vector of (X’, Y’, Z’) from near object to far object in the rectangular coordinate system;

// the direction vector of X’Y’Z’ of a physics pixel is perpendicular to the interface of the physics pixel at the point of the physics pixel //

// the direction vector of X’Y’Z’ of the reflected physics pixel is from the near/first object to the far/second object along the reflected ray view of the corresponding pixel //

RGBA_2;

// RGBA_2 is for the physics pixel of X’Y’Z’, which may be a reflected physics pixel //

// for a real physics pixel, only one RGBA is needed, or RGBA_1 and RGBA_2 are the same //

// for a pair of an image physics pixel and its corresponding reflected physics pixel, RGBA_1 is for the image and RGBA_2 is for the reflected //

object depth of the far object along the physics pixel’s ray view: the distance from the 1st interface to the 2nd interface along the ray view of the pixel corresponding to the physics pixel;
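As a hedged sketch of the mapping referenced in the parameter list above: assuming a pinhole camera model with known intrinsics (fx, fy, cx, cy) and a known rigid transform (R, t) from the camera frame to the unified rectangular system, a pixel position plus depth back-projects as follows; the post itself does not fix this mapping.

import numpy as np

def perspective_to_unified(x_px, y_px, depth, fx, fy, cx, cy, R, t):
    """Map perspective coordinates (pixel x, pixel y, depth) to unified rectangular
    coordinates X'Y'Z', assuming a pinhole camera and a rigid transform (R, t)
    from the camera frame to the unified frame. Illustrative only."""
    # Back-project the pixel into the camera frame (depth and t share the same unit).
    p_cam = np.array([(x_px - cx) * depth / fx,
                      (y_px - cy) * depth / fy,
                      depth])
    # Rigidly transform into the unified rectangular coordinate system.
    return R @ p_cam + t

# Example with identity extrinsics: X'Y'Z' equals the camera-frame point.
print(perspective_to_unified(640, 360, 2000.0, 1000.0, 1000.0, 640.0, 360.0,
                             np.eye(3), np.zeros(3)))   # -> [0. 0. 2000.]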

———-

1.2) parameters for the interface

Interface ID;

Overlap number of the interface = 0;

refractive index;

———-

1.3) parameters of near object and far object

1.3.1) parameter of the near object

object id of the near object: 111;

object class: camera;

parent object: observer;

relative velocity vector;

ground velocity vector;

ground rotation vector;

mass of the object;

———-

1.3.2) parameter of the far object

object id of the object: 000;

object class: outdoor space;

parent object: None;

relative velocity vector;

ground velocity vector;

ground rotation vector;

mass of the object;

———-

1.4) parameters of the 2 points which the physics pixel is located on the near object and far object respectively

1.4.1) parameters of the near point on the near object

relative velocity vector;

ground velocity vector;

ground rotation vector;

material of the point: glass;

pressure on the point;

temperature on the point;

reflection rate;

———-

1.4.2) parameters of the far point on the far object

relative velocity vector;

ground velocity vector;

ground rotation vector;

material of the point: air;

pressure on the point;

temperature on the point;

reflection rate;

———-

1.5) parameters for the two surfaces of the two objects forming the interface

1.5.1) parameters for the near surface of the near object

surface ID;

reflection rate;

material;

text on surface (this parameter can be put into the dedicated text field of omnitoken).

———-

1.5.2) parameters for the far surface of the far object

surface ID;

reflection rate;

material;

text on surface (this parameter can be put into the dedicated text field of omnitoken).

———-

2. 2nd physics pixel on 2nd interface

……

3. 3rd physics pixel on 3rd interface

……

4. 4th physics pixel on 4th interface

……
