
How to run te0726_m_demo2 on the board

Started by pantho2008, April 17, 2017, 06:35:36 PM


pantho2008

Hello,
I am new to the ZynqBerry. I am trying to get te0726_m_demo2 (2016.2) running on the board, but I couldn't find the right instructions to do so. The resource files do not come with a proper set of instructions.

I was able to get te0726_m_demo1 running on the board, but the same steps do not work for demo2: I don't get a terminal, and no camera image appears on the screen either. If someone can point me to where to look, it would be really helpful.

Thorsten Trenz

Hi,
did you follow the steps listed on this page?

https://shop.trenz-electronic.de/en/Download/?path=Trenz_Electronic/TE0726/Reference_Design/2016.2/te0726_m_demo2

What is your exact version of TE0726? Maybe give us the serial number on the board.

Best Regards,
Thorsten Trenz

pantho2008

Hello,
Thank you for your response. I was mistakenly skipping a simple step in the middle; now it is working perfectly. However, I have one extra question: how can I read the camera image from Linux in this design?

Thank you in advance.

Oleksandr Kiyenko

Hello,

You can take screenshots from the camera framebuffer; you can cross-compile fbgrab or a similar utility for that.
Or, if you want to process the raw image, you can map the framebuffer RAM in your application and read the image in the framebuffer format, a8b8g8r8.
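
Reading the framebuffer boils down to mmap-ing /dev/mem at the framebuffer's physical offset and decoding 4-byte a8b8g8r8 pixels. A minimal sketch of just the pixel decoding, run here against a synthetic buffer rather than the real device (the little-endian R, G, B, A in-memory byte order is an assumption based on the a8b8g8r8 word layout):

```python
import struct

def decode_a8b8g8r8(buf):
    """Decode raw a8b8g8r8 framebuffer bytes into (r, g, b, a) tuples.

    Assumes little-endian memory layout, so each 32-bit a8b8g8r8 pixel
    is stored as the byte sequence R, G, B, A.
    """
    pixels = []
    for off in range(0, len(buf), 4):
        r, g, b, a = struct.unpack_from("<4B", buf, off)
        pixels.append((r, g, b, a))
    return pixels

# Two synthetic pixels: opaque red, half-transparent green.
raw = bytes([0xFF, 0x00, 0x00, 0xFF,
             0x00, 0xFF, 0x00, 0x80])
print(decode_a8b8g8r8(raw))  # -> [(255, 0, 0, 255), (0, 255, 0, 128)]
```

On the board, the same decoding would be applied to the bytes read from the mapped framebuffer region.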

Best regards
Oleksandr Kiyenko

pantho2008

Hello,
To get a screenshot, I installed fbgrab and tried this command: < fbgrab -d /dev/fb1 screen.png >. It only gives me a yellow image with a few red and green dots, not the camera image.

To get the raw image and serve it over HTTP I used the following code.

Most parts are borrowed from: https://lauri.xn--vsandi-pxa.com/hdl/zynq/xilinx-video-capture.html
-----------------------------------------------------------------------------------
import os, png, mmap
from http.server import BaseHTTPRequestHandler, HTTPServer
FRAMEBUFFER_OFFSET=0x1F700000
WIDTH = 1280
HEIGHT = 720
PIXEL_SIZE = 4
fh = os.open("/dev/mem", os.O_SYNC | os.O_RDONLY) # Disable cache, read-only
mm = mmap.mmap(fh, WIDTH*HEIGHT*PIXEL_SIZE, mmap.MAP_SHARED, mmap.PROT_READ, offset=FRAMEBUFFER_OFFSET)
class MyHandler(BaseHTTPRequestHandler):
    def do_GET(s):
        writer = png.Writer(WIDTH, HEIGHT, alpha=True)
        s.send_response(200)
        s.send_header("Content-type", "image/png")
        s.end_headers()
        writer.write_array(s.wfile, mm[0:WIDTH*HEIGHT*PIXEL_SIZE])
httpd = HTTPServer(("0.0.0.0", 8910), MyHandler)
try:
    httpd.serve_forever()
except KeyboardInterrupt:
    pass
httpd.server_close()
mm.close()
fh.close()
---------------------------------

But I am getting empty frames at that framebuffer offset. I think I am missing something obvious. Do I need to enable the camera separately? Is it possible to share the resource files for reading the camera image framebuffer?

Thank you

Oleksandr Kiyenko

Hello,
The code you listed is not suited for this project; I had to modify it:


import os, png, mmap, BaseHTTPServer  # Python 2

#FRAMEBUFFER_OFFSET=0x1F700000  # camera framebuffer
FRAMEBUFFER_OFFSET=0x1FC00000   # HDMI framebuffer
WIDTH = 1280
HEIGHT = 720
PIXEL_SIZE = 4

fh = os.open("/dev/mem", os.O_SYNC | os.O_RDONLY) # Disable cache, read-only
mm = mmap.mmap(fh, WIDTH*HEIGHT*PIXEL_SIZE, mmap.MAP_SHARED, mmap.PROT_READ, offset=FRAMEBUFFER_OFFSET)

def get_pixels():
    # Keep the first three bytes (R, G, B) of each 4-byte pixel
    # and drop the fourth (alpha).
    result = []
    p = 0
    for d in mm[0:WIDTH*HEIGHT*PIXEL_SIZE]:
        if p%4 == 0:
            a = d
        if p%4 == 1:
            b = d
        if p%4 == 2:
            c = d
        if p%4 == 3:
            result.append(a)
            result.append(b)
            result.append(c)
        p = p + 1
    return result

class MyHandler(BaseHTTPServer.BaseHTTPRequestHandler):
    def do_GET(s):
        # get_pixels() returns three channels per pixel, so write
        # an RGB PNG (no alpha).
        writer = png.Writer(WIDTH, HEIGHT)
        s.send_response(200)
        s.send_header("Content-type", "image/png")
        s.end_headers()
        writer.write_array(s.wfile, [ord(j) for j in get_pixels()])

httpd = BaseHTTPServer.HTTPServer(("0.0.0.0", 80), MyHandler)
try:
    httpd.serve_forever()
except KeyboardInterrupt:
    pass
httpd.server_close()

mm.close()
fh.close()



Use
FRAMEBUFFER_OFFSET=0x1FC00000
for the HDMI image and
FRAMEBUFFER_OFFSET=0x1F700000
for the camera image.
Note that you have to initialize the camera before use:
rpi-camera /dev/i2c-5

Best regards
Oleksandr Kiyenko

pantho2008

Hello,
Thank you for your generous response. You are the best help I have found on the internet.
So this is what happened: the Debian image Trenz provided with demo2 does not have the rpi-camera application in it. So I built the PetaLinux in the os/ directory, copied the rpi-camera executable to the bin folder, and then enabled the camera with < rpi-camera /dev/i2c-5 >.
I executed your code with Python 2.7. The HDMI image is more or less coming through, but for the camera I am getting a pink-colored image like the attachments. I am starting to believe I am getting somewhere, but the camera image is not there yet. Please let me know what the reason behind it might be; I would be really grateful.

Oleksandr Kiyenko

Hello,

You also have to start the video stream using
devmem 0x43c10040 32 1
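
The devmem call above can also be done from Python: map /dev/mem at the page containing the register and write the 32-bit word. A minimal sketch (the register address 0x43c10040 comes from this thread; the helper is demonstrated here against a temporary file standing in for /dev/mem, since the real device needs root):

```python
import mmap, os, struct, tempfile

PAGE = mmap.PAGESIZE

def write_reg32(path, addr, value):
    """Write a 32-bit little-endian value at physical address addr
    via the given memory device (e.g. /dev/mem)."""
    base = addr & ~(PAGE - 1)          # mmap offset must be page-aligned
    fd = os.open(path, os.O_RDWR | os.O_SYNC)
    try:
        mm = mmap.mmap(fd, PAGE, mmap.MAP_SHARED,
                       mmap.PROT_READ | mmap.PROT_WRITE, offset=base)
        try:
            struct.pack_into("<I", mm, addr - base, value)
        finally:
            mm.close()
    finally:
        os.close(fd)

# Demo on a temporary file instead of /dev/mem:
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * (2 * PAGE))      # fake "physical memory"
    tmp = f.name

write_reg32(tmp, PAGE + 0x40, 1)       # like: devmem 0x43c10040 32 1

with open(tmp, "rb") as f:
    f.seek(PAGE + 0x40)
    print(struct.unpack("<I", f.read(4))[0])  # -> 1

os.remove(tmp)
```

On the board, the same helper would be called as write_reg32("/dev/mem", 0x43c10040, 1), as root.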


Best regards
Oleksandr Kiyenko