Apr 20, 2012

Simple web server

Ever find you need to implement a web server or provide some web pages on a local network? Maybe you don't have an Apache server up and running, or don't want to go to the trouble of configuring one. I've found this with Natural Docs documentation, where the tools generate a bunch of .html files that you can read locally if you are on the machine where they were generated. Pygments will also generate HTML-formatted files that you might want to serve. However, you might want to make them accessible to a small team. Here is a quick trick that can make this task very simple. Python ships with a basic HTTP server module. You can get it up and running, serving files from a given directory down, with the following command:

python -m SimpleHTTPServer

and that's all there is to it! The default port is 8000, but you can provide a different port on the command line (just add the port number after SimpleHTTPServer - any port above the reserved range of 0-1023 will work). The server will be accessible to any machine that can see that port on the machine it is running on. The server won't handle a high load and isn't particularly secure, so I wouldn't make it available to the general public. But if you need to serve some pages quickly and simply, to a small number of users within a private network, this can be a really fast way to get to that point. If and when the load becomes an issue, or security concerns are important, then a heavyweight server like Apache is a much better choice.

Once the server is up and running, you access it via a web browser at http://machine_name:8000. File paths are all relative to the directory the server is running in. If there is an index.html in that directory, it will be loaded by default when you access the server at that URL.
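
For example, to serve a directory of generated documentation on a different port (the directory path and the port number here are just placeholders):

cd ~/docs/natural_docs
python -m SimpleHTTPServer 8080

Anyone on the network who can reach that machine then picks the pages up at http://machine_name:8080.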


Aug 20, 2008

show my pc

frisco shore

I'm on an extended work trip at the moment. My wife and I have been keeping in touch with Skype, using the video conferencing and VoIP on a daily basis. It's great to be able to see each other and talk without speaking into a phone. I've come to realise the value of a good microphone and proper speaker setup, as well as a decent camera. In general it has been really effective, at least within the US: typically fast, with quite clear audio and good picture quality.

We also needed to look at some documents on the computer. Skype doesn't have a remote desktop sharing option, and we had some problems getting Windows Live Messenger screen sharing to work, with the added difficulty of sharing from Windows XP to my Mac (I'd have been using a VMware Fusion Windows session). Instead, we tried http://www.showmypc.com. I was impressed how easy it was to set up: just download an application and run it (a bit of trust required there). Then there is a password that you share with the user at the other end of the link and you are connected. Under the hood it sets up a VNC session via an SSH server, but from an end user perspective, all you need is the password to connect. It was fast, easy to use and worked well between Windows XP and OS X. They have options where you set up your own SSH server in the middle, rather than trusting their servers, but I think you have to pay for that option. In general, it was surprisingly easy and effective. We were able to keep the Skype video call running in parallel too.

The picture of the beach? That's the Outer Banks in North Carolina - about 4 hours from where I'm working at the moment. I took the opportunity to go there for the weekend and camp out in the sand. Traveling for work isn't ideal, but I try to make the most of the places I do get to go, in particular seeing the area or taking advantage of museums and art galleries I wouldn't normally have the chance to visit. In that respect, I tend to disagree with Grant Martin's sentiment over on his blog. I've still been able to disconnect on business trips and see a bit of the local culture. Two weeks ago I saw a Monet at the North Carolina Museum of Art. This weekend I was in the Outer Banks. I think you have to push yourself a bit to really take advantage of the opportunities that might be there, but it can certainly be done. For every business trip I've been on in the last few years, I've worked to get to at least one museum or art gallery, just to make it worthwhile.


Jun 20, 2008

a somewhat crazy notion

Death Valley, after the storm

Many ideas have been whirling around in my head since being at DAC. I've been inspired to learn some new things, starting with the Open Verification Methodology but also revisiting some of the Electronic System Level tools and flows that I've worked with in the past. I'm interested in exploring visualization techniques and tools and how they might be applied to verification and design. I'd also like to learn more about a few of the more interesting formal verification tools, like OneSpin 360 MV and maybe explore what is possible with ESL tools like Bluespec's SystemVerilog flow or the various other similar tools that are out there.

I have a difficult time learning things just for the sake of it, tending to be driven more by necessity than by idle curiosity. I've been doing some work based around a small CPU core and started getting frustrated with the way the CPU was architected. This led me to start considering designing my own CPU, just for fun - partly as a motivation to crack open a Hennessy & Patterson book that I've been meaning to read for a few years, partly to see if I can do it, and partly as a vehicle to hang all those other ideas upon.

I've been looking around the web, browsing on OpenCores and finding humbling projects, such as HomebrewCPU, which is a Minix-compatible CPU constructed entirely from discrete gates. You can even browse the web pages it is serving, or telnet in to it! To my way of thinking, that is slightly nuts - impressive, but nuts all the same - five wire-wrapped boards to debug. My background is in FPGAs and that seems the perfect technology for this sort of exploration. I'm also thinking along the way that I might be able to play with synthesisable verification or FPGA-enhanced verification/emulation, as well as possibly using this as a platform for a reconfigurable architecture. Lots of ideas, potential and possibilities. It will also give me a chance to re-engage with FPGA technologies and learn more about the state of those tools. The various tools are getting to a fairly mature point, and a simple pipelined CPU shouldn't require too much work but should still be complex enough to do interesting things with.

I've been looking at Xilinx and Altera to get an understanding of their current tool flows and trying to work out language support and maturity - which would be the best option for SystemVerilog, where the best simulation options are, and that kind of thing. No real conclusions yet, but both have trial versions of what appears to be a complete tool chain, so I will probably drive a small example through both flows as a pipe cleaner.

Then of course there are the more fundamental religious issues - CISC or RISC, and what ISA to use. Roll my own, pick up an already defined but open architecture, or something in between? I'm looking for suggestions in this respect - I know ARM are quite litigious when it comes to cloning their ISA, so I'll be avoiding that, but OpenSPARC might well be a good option. Any other suggestions? I'm not sure whether the early MIPS ISAs can be cloned without problems. Maybe I could go right back to my roots and implement a Z80 architecture. The advantage of picking an existing ISA is that the tools come mostly for free. While porting gas and gcc to my own ISA could also be an interesting experiment and learning experience, it would probably be more of a distraction than I want.

I am a fan of the Python language and tend to write most of my personal projects in it. As a result, I'm intrigued by the potential for writing the core in Python, using some of the available extensions and libraries. Two packages seem to already exist: MyHDL and PyHVL. MyHDL is a Python extension that lets you express parallel behaviour which can then be automatically translated to Verilog or VHDL. PyHVL provides the other piece of the Python puzzle, enabling high-level verification in Python. So potentially I could do the design and verification in Python and then drive through into an FPGA flow for implementation. JL jokingly mentioned the potential for an OVM port to Python, but maybe it isn't such a crazy notion. The thing that Python is fantastic for is expressing complex ideas quickly and without a lot of fuss or housekeeping. From the verification perspective it seems a perfect match, as I can focus more on the testing and less on the language. I'm a bit more skeptical about using it on the design side, but I think it might be worth a look.
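
To give a flavour of the MyHDL side, here is a minimal sketch using its classic Signal/always/toVerilog API - the counter is just a stand-in example, not anything from this project:

from myhdl import Signal, intbv, always, toVerilog

def counter(clk, count):
    # A trivial piece of parallel behaviour: free-running 8-bit counter,
    # expressed in Python and convertible to Verilog.
    @always(clk.posedge)
    def logic():
        count.next = (count + 1) % 256
    return logic

clk = Signal(bool(0))
count = Signal(intbv(0)[8:])        # 8-bit signal
toVerilog(counter, clk, count)      # writes out counter.v

The same description can also be exercised from a Python testbench before conversion, which is where it starts to look attractive for the verification side too.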

To kick things off, I found the description for a minimal CPU on OpenCores. This is a really basic 8-bit processor: 4 op codes, a couple of registers and a very simple, small architecture, yet it can still do some useful work. This evening I wrote a Python ISS for it, just to prove out some ideas. It took about an hour to get a working cycle-based ISS together for the architecture. Of course, the next thing you immediately need is an assembler, and that was fairly easy to put together in Python too. Nothing particularly complex, but a two-pass assembler that supports labels and constants and generates code that runs on the core. I'm able to assemble and run the example greatest common divisor (Dijkstra's algorithm) described in the paper, and it has given me a good indication of the direction to go. So far, my couple-of-hours project can assemble and execute the following code:

start:
    NOR allone      ; akku == 0
    NOR b
    ADD one         ; akku = -b

    ADD a           ; akku = a - b
                    ; Carry set when akku >= 0
    JCC neg

    STA a

    ADD allone      ; akku = akku - 1 (carry set unless akku was 0)
    JCC end         ; A=0 ? -> end, result in b

    JCC start

neg:
    NOR zero
    ADD one         ; Akku = -Akku

    STA b
    JCC start       ; carry not altered

end:
    JCC end

a: 10  ; a & b are the two values to consider
b: 30

allone: 0xff  ; various constants
zero:   0
one:    1

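The ISS itself needs very little code. The sketch below is not my actual implementation, just an illustration of roughly what a cycle-based model of this core can look like in Python; the opcode encodings and the JCC carry-clearing behaviour are taken from my reading of the MCPU description:

# Illustrative sketch of a cycle-based ISS for the minimal CPU:
# 8-bit accumulator plus carry, 2-bit opcode, 6-bit operand address.
NOR, ADD, STA, JCC = 0, 1, 2, 3     # assumed opcode encodings

def run(mem, max_cycles=10000):
    # mem is a 64-entry list of bytes; program and data share the space.
    acc, carry, pc = 0, 0, 0
    for _ in range(max_cycles):
        op, addr = mem[pc] >> 6, mem[pc] & 0x3F
        pc = (pc + 1) & 0x3F
        if op == NOR:                       # acc = ~(acc | mem[addr])
            acc = ~(acc | mem[addr]) & 0xFF
        elif op == ADD:                     # acc += mem[addr], carry on overflow
            total = acc + mem[addr]
            acc, carry = total & 0xFF, total >> 8
        elif op == STA:                     # mem[addr] = acc
            mem[addr] = acc
        else:                               # JCC: branch if carry clear,
            if not carry:                   # then clear the carry flag
                pc = addr
            carry = 0
    return mem

Feed it the assembled image of the listing above and, once the program has settled into the end loop, the value at location b should be the GCD.
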
Next step is to describe this trivial core in a synthesisable form and see how I get on running it through one or two FPGA flows. A few tests and some verification could be useful too! For FPGAs I'm much more used to the general suck it and see style of testing that is the norm. Synthesize, place and route and see if it works. In the last several years I've been working on much larger ASICs so have certainly seen the value of more robust verification and I think FPGA technology has probably spent too much time as the wild frontier of design robustness and testing. As this project progresses I want to explore what the best balance is for testing and how the test environments can use the FPGA to accelerate testing along the way.

So plenty of things up in the air but I think this could be fun.

